I'm writing a program that reads a text file word by word, and I can't find information on whether Scanner or StringTokenizer is faster.
Right now, this is the part of my code that does the reading, and it is running extremely slowly:
try {
    BufferedReader input = new BufferedReader(new FileReader(filename));
    String line, currentWord;
    int counter = 0;
    while ((line = input.readLine()) != null) {
        Scanner scanner = new Scanner(line);
        counter++;
        if ((counter % 5000) == 0) {
            System.out.print("."); // progress marker every 5000 lines
        }
        while (scanner.hasNext()) { // process every word on the line, not just the first
            currentWord = scanner.next();
            try {
                // containsObject returns the word's index; remove any existing copy
                cache.removeObject(cache.containsObject(currentWord));
            } catch (IndexOutOfBoundsException e) {
                // word was not in the cache yet; nothing to remove
            }
            cache.addObject(currentWord);
        }
        scanner.close();
    }
    input.close();
} catch (IOException e) {
    System.out.println("Could not read file.");
    e.printStackTrace();
}
Also, is BufferedReader the best option for speed here, or should I use a Scanner for the lines as well?
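To make that question concrete, this is a sketch of the whole-file Scanner variant I have in mind: a single Scanner pulls whitespace-delimited tokens directly, with no BufferedReader and no per-line Scanner. Again it's a toy that only counts words (the real code would feed `cache`); `Scanner(Readable)` lets it run against an in-memory string here, but it would wrap the `FileReader` in my actual program:

```java
import java.io.StringReader;
import java.util.Scanner;

public class SingleScannerSketch {
    // One Scanner over the whole input, pulling whitespace-delimited
    // tokens directly instead of reading lines first.
    static int countWords(Readable source) {
        Scanner scanner = new Scanner(source);
        int words = 0;
        while (scanner.hasNext()) {
            scanner.next(); // in my real code, this word would go into the cache
            words++;
        }
        scanner.close();
        return words;
    }

    public static void main(String[] args) {
        System.out.println(countWords(new StringReader("one two\nthree"))); // prints 3
    }
}
```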
Thank you.