
Java Security


Calculating hash values for really big files

843810 — Jun 11 2004, edited Jun 13 2004

I am using the following code to calculate the hash value of a file:
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public static String hash(File f, String algorithm)
            throws IOException, NoSuchAlgorithmException {
        if (!f.isFile()) {
            throw new IOException("Not a file");
        }
        RandomAccessFile raf = new RandomAccessFile(f, "r");
        byte[] b = new byte[(int) raf.length()]; // entire file buffered in memory
        raf.readFully(b);
        raf.close();
        MessageDigest messageDigest = MessageDigest.getInstance(algorithm);
        messageDigest.update(b);
        return toHexString(messageDigest.digest());
    }
The problem is that for really big files, 100 MB or more, I get an OutOfMemoryError.

I have used the -Xms and -Xmx options to increase the JVM heap size, and ultimately made it work. However, this feels like a workaround, and there is also a limit to how far -Xmx can be raised.

Is there any other way I can calculate the hash values of these really big files?
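One way around this (a sketch, not from the original thread): feed the file to MessageDigest in fixed-size chunks instead of reading it all into one array, so memory use stays constant no matter how large the file is. The class name, buffer size, and hex-formatting helper below are my own choices; the block uses try-with-resources, which needs Java 7+ (on a 2004-era JVM you would close the stream in a finally block instead).

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class StreamingHash {
    // Hash a file of any size with a small fixed buffer:
    // each chunk read from disk is fed to the digest and then reused,
    // so heap usage is ~8 KB regardless of file length.
    public static String hash(File f, String algorithm)
            throws IOException, NoSuchAlgorithmException {
        if (!f.isFile()) {
            throw new IOException("Not a file");
        }
        MessageDigest md = MessageDigest.getInstance(algorithm);
        try (InputStream in = new FileInputStream(f)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n); // update the digest incrementally
            }
        }
        // Format the final digest as lowercase hex
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        File tmp = File.createTempFile("hash", ".bin"); // empty file
        tmp.deleteOnExit();
        // SHA-256 of empty input:
        // e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
        System.out.println(StreamingHash.hash(tmp, "SHA-256"));
    }
}
```

java.security.DigestInputStream offers the same incremental behavior by wrapping the stream itself; the explicit loop above just makes the chunking visible.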

Thanks a lot in advance.
Post Details
Locked on Jul 11 2004
Added on Jun 11 2004