I am using the following code to calculate the hash values of files:
public static String hash(File f, String algorithm)
        throws IOException, NoSuchAlgorithmException {
    if (!f.isFile()) {
        throw new IOException("Not a file");
    }
    RandomAccessFile raf = new RandomAccessFile(f, "r");
    byte[] b;
    try {
        // reads the entire file into memory at once (and the int cast
        // would overflow for files larger than 2 GB)
        b = new byte[(int) raf.length()];
        raf.readFully(b);
    } finally {
        raf.close();
    }
    MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
    messageDigest.update(b);
    return toHexString(messageDigest.digest());
}
Now the problem is that for really big files, 100 MB or more, I get an OutOfMemoryError, presumably because the whole file is buffered in memory before hashing.
I have used the -Xms and -Xmx options to increase the JVM heap size, and ultimately made it work. However, this feels like a poor workaround, and there is also a limit to how far -Xmx can be pushed.
Is there any other way I can calculate the hash values of these really big files?
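For context, the direction I have been considering is feeding MessageDigest one chunk at a time instead of loading the whole file, since update() can be called repeatedly. Here is a sketch of what I think that would look like; the 8192-byte buffer size is an arbitrary choice of mine, and toHexString is a stand-in for the helper my original code already uses:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class StreamingHash {
    public static String hash(File f, String algorithm)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance(algorithm);
        try (InputStream in = new FileInputStream(f)) {
            byte[] buf = new byte[8192]; // buffer size is arbitrary; memory use stays constant
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n); // digest one chunk at a time
            }
        }
        return toHexString(md.digest());
    }

    // stand-in for the toHexString helper referenced in my original code
    static String toHexString(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}
```

Is this the idiomatic way to do it, or is there something better?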
Thanks a lot in advance.