Cost of reading a flat file in Java using StringTokenizer
807589 Nov 5 2008 — edited Nov 6 2008
I am reading a flat file that is pipe-separated.
The file has about 25,000 records, and each record has 22 fields of different character lengths.
My requirement is to enforce a fixed length for each field of the record and then insert it into the database.
For that I have a function:
String ensureLength(String str_S, int size_i)
{
    str_S = str_S.trim();
    int len_i = str_S.length();
    if (len_i > size_i)
    {
        // Too long: truncate to the field size.
        str_S = str_S.substring(0, size_i);
    }
    else if (len_i < size_i)
    {
        // Too short: pad with spaces up to the field size.
        StringBuffer stbf = new StringBuffer(size_i);
        stbf.append(str_S);
        for (int i = len_i; i < size_i; i++)
        {
            stbf.append(' ');
        }
        str_S = stbf.toString();
    }
    return str_S;
}
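To illustrate what the function does, here is a minimal self-contained harness (the class name and the sample values are my own, not from my real code) that exercises both the padding and truncation paths:

```java
public class EnsureLengthDemo
{
    // Same logic as the ensureLength function above:
    // trim, then truncate or right-pad with spaces to size_i characters.
    static String ensureLength(String str_S, int size_i)
    {
        str_S = str_S.trim();
        int len_i = str_S.length();
        if (len_i > size_i)
        {
            return str_S.substring(0, size_i);
        }
        StringBuffer stbf = new StringBuffer(size_i);
        stbf.append(str_S);
        for (int i = len_i; i < size_i; i++)
        {
            stbf.append(' ');
        }
        return stbf.toString();
    }

    public static void main(String[] args)
    {
        // Short value gets padded, long value gets truncated.
        System.out.println("[" + ensureLength("  John ", 6) + "]");   // prints [John  ]
        System.out.println("[" + ensureLength("Johnathan", 4) + "]"); // prints [John]
    }
}
```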
I pass each token from the StringTokenizer to this function, together with the fixed size for that field, like this:
data.Name_S = ensureLength(line_stk.nextToken(),36);
This is done for each field of every row (approx 25,000 rows in the file).
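For context, the surrounding read loop is roughly like the following sketch (BufferedReader, the two sample fields, their widths, and the sample input are assumptions for illustration; the real code reads all 22 fields from the file and inserts each row into the database):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class FlatFileReaderSketch
{
    // Pads or truncates a field to a fixed width, as in the function above.
    static String ensureLength(String s, int size)
    {
        s = s.trim();
        if (s.length() > size)
        {
            return s.substring(0, size);
        }
        StringBuffer sb = new StringBuffer(size);
        sb.append(s);
        for (int i = s.length(); i < size; i++)
        {
            sb.append(' ');
        }
        return sb.toString();
    }

    // Reads pipe-separated records and fixes the width of each field.
    // Only two hypothetical fields are shown; the real file has 22.
    static List<String[]> readRecords(Reader src) throws IOException
    {
        List<String[]> rows = new ArrayList<String[]>();
        BufferedReader in = new BufferedReader(src);
        String line;
        while ((line = in.readLine()) != null)
        {
            StringTokenizer line_stk = new StringTokenizer(line, "|");
            String name = ensureLength(line_stk.nextToken(), 36);
            String city = ensureLength(line_stk.nextToken(), 20);
            rows.add(new String[] { name, city });
        }
        return rows;
    }

    public static void main(String[] args) throws IOException
    {
        String sample = "John Doe|Pune\nJane Roe|Mumbai\n";
        for (String[] row : readRecords(new StringReader(sample)))
        {
            System.out.println("[" + row[0] + "][" + row[1] + "]");
        }
    }
}
```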
The problem is that up to a size of 32, i.e. when I call it like this:
data.Name_S = ensureLength(line_stk.nextToken(),32);
the code works fine, taking approx 5-6 minutes to insert all the rows into the table,
but with a size of 36 it takes more than 50 minutes.
Please suggest some alternatives.