I've been struggling with an error and I can't figure out why I'm getting it. The error is:
Caused by: java.lang.OutOfMemoryError
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:174)
at DiskUtility.readFile(DiskUtility.java:214)
The code for this is:
public byte[] readFile( File file ) throws Exception
{
    FileInputStream stream = new FileInputStream(file);
    byte[] bb = new byte[stream.available()];
    stream.read(bb);
    stream.close();
    return bb;
}
I run my application with -Xms512M and -Xmx1024M, but it doesn't look like it allocates any more than 200 MB before it decides to throw this error.
When I run with GC logging (-Xloggc), the log reports that my max heap size is indeed 1024M and that it never uses more than 200M, but I do not see any errors there.
I am attempting to read in an extremely large file: it is 128 MB, or 128000004 bytes. Is there a limitation on reading large files all at once, or is there a safer way to do this? The JVM can obviously allocate all that memory into a byte array (see the standalone check further down), so I'm wondering why it would choke on reading the file. If I try smaller files (around 7-40 MB), I don't get this issue.
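In case it helps, this is the loop-based variant I'm considering trying instead. It's just a sketch: the readFileChunked name, the 64K chunk size, and the use of file.length() instead of available() are my own guesses at a safer pattern, and I haven't confirmed it avoids the error.

// (uses java.io.File, FileInputStream, IOException, EOFException)
public byte[] readFileChunked( File file ) throws IOException
{
    int length = (int) file.length();   // actual file size; assumes file < 2 GB
    byte[] bb = new byte[length];
    FileInputStream stream = new FileInputStream(file);
    try
    {
        int offset = 0;
        while ( offset < length )
        {
            // Read at most 64K at a time; read() may return fewer bytes than requested.
            int count = stream.read( bb, offset, Math.min( length - offset, 65536 ) );
            if ( count < 0 )
                throw new EOFException( "unexpected end of file at offset " + offset );
            offset += count;
        }
    }
    finally
    {
        stream.close();   // close even if the read fails partway through
    }
    return bb;
}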
The other interesting part is that normally my OOM errors come without a stack trace; in this case I have one I can use.
Finally, the last and most confusing part: if I DECREASE the -Xmx setting to only 512M, then I can process this file.
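For reference, my claim above that the byte array itself can be allocated is based on a standalone check along these lines (AllocTest is just a throwaway test class, not part of my application):

public class AllocTest
{
    public static void main( String[] args )
    {
        // Allocate the same number of bytes as the file, with no I/O involved.
        byte[] bb = new byte[128000004];
        System.out.println( "allocated " + bb.length + " bytes" );
    }
}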
My specs are:
J2SDK 1.4.2_03
Dual Pentium 3, 1.5 GHz
2 GB RAM
Windows 2000 Server
Any help or pointers are appreciated; I'm stumped on this one.
Mitch