jdbc-fetch-size and extremely large dataset (35GB table)
Hi,
I'm trying to fetch an extremely large data set, and the desired behavior is that the job keeps running without an out-of-memory exception (i.e. it's fine if it takes days, as long as it doesn't blow up on out-of-memory).
The job runs on a 64-bit machine with 16 processors.
For testing purposes I used a table 35 GB in size (of course this wouldn't be our typical fetch, it's just to stress JDBC). The fetch fails with:
??? Java exception occurred:
java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Unknown Source)
        at java.util.Vector.ensureCapacityHelper(Unknown Source)
        at java.util.Vector.addElement(Unknown Source)
        at com.mathworks.toolbox.database.fetchTheData.dataFetch(fetchTheData.java:737)

Error in ==> cursor.fetch at 114
dataFetched = dataFetch(fet,resultSetMetaData,p.NullStringRead,tmpNullNumberRead);
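From the trace, it looks like the toolbox's fetchTheData.dataFetch appends every row to a single java.util.Vector before anything is returned to MATLAB, so the whole result set has to fit on the Java heap at once. Conceptually it would be doing something like the following (this is just my reading of the stack trace, not the actual MathWorks code; the table name and column handling are made up):

import java.sql.*;
import java.util.Vector;

public class FetchAllAtOnce {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:sqlserver://SomeDatabaseServer:1433;"
                + "databaseName=SomeDatabase;integratedSecurity=true;";
        // Every row goes into one in-memory Vector, so heap use grows with
        // the full result set (35 GB here) instead of staying bounded.
        Vector<Object[]> rows = new Vector<Object[]>();
        Connection conn = DriverManager.getConnection(url);
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT * FROM SomeLargeTable");
        int cols = rs.getMetaData().getColumnCount();
        while (rs.next()) {
            Object[] row = new Object[cols];
            for (int i = 1; i <= cols; i++) {
                row[i - 1] = rs.getObject(i);
            }
            rows.addElement(row);   // presumably where Vector.ensureCapacityHelper runs out of heap
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}

If that's really what it does, the accumulation happens on the MATLAB/Java side regardless of how many rows come back per round trip, but I wanted to rule out the fetch size anyway.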
In light of this problem, I added jdbc-fetch-size=999999999 to my JDBC connection string:
'com.microsoft.sqlserver.jdbc.SQLServerDriver','jdbc:sqlserver://SomeDatabaseServer:1433;databaseName=SomeDatabase;integratedSecurity=true;jdbc-fetch-size=999999999;'
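As an aside, my understanding is that in plain JDBC the fetch size is normally a per-statement hint (Statement.setFetchSize) rather than a URL property, so I'm not sure the jdbc-fetch-size entry in the string above is even recognized by the SQL Server driver. Outside the Database Toolbox it would look roughly like this (table name is a placeholder):

import java.sql.*;

public class StreamingFetch {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:sqlserver://SomeDatabaseServer:1433;"
                + "databaseName=SomeDatabase;integratedSecurity=true;";
        Connection conn = DriverManager.getConnection(url);
        Statement stmt = conn.createStatement();
        stmt.setFetchSize(1000);   // hint: rows per round trip, set on the statement, not in the URL
        ResultSet rs = stmt.executeQuery("SELECT * FROM SomeLargeTable");
        while (rs.next()) {
            // process one row at a time instead of buffering the whole table
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}

I also don't know whether the SQL Server driver honors that hint with its default settings or whether it needs something like selectMethod=cursor or responseBuffering=adaptive on the connection, nor how (or if) cursor.fetch passes any of this through.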
Either way, with that connection string I get a slightly different error:
??? Java exception occurred:
java.lang.OutOfMemoryError: GC overhead limit exceeded
        at java.lang.Integer.toString(Unknown Source)
        at java.sql.Timestamp.toString(Unknown Source)
        at com.mathworks.toolbox.database.fetchTheData.dataFetch(fetchTheData.java:721)

Error in ==> cursor.fetch at 114
dataFetched = dataFetch(fet,resultSetMetaData,p.NullStringRead,tmpNullNumberRead);
Any suggestions?
REF:
JDBC: http://msdn.microsoft.com/en-us/library/ms378526.aspx
32-bit vs. 64-bit: http://support.microsoft.com/kb/294418