Given a table RSFILE with the following columns:

NAME         DATA TYPE           NULL  DEFAULT  COMMENTS
*ID          NUMBER(19,0)        No
VERSION      NUMBER(19,0)        No
DESCRIPTION  VARCHAR2(255 CHAR)  Yes
FILE_NAME    VARCHAR2(255 CHAR)  No
SDRMUSER_ID  NUMBER(19,0)        No
UPLOAD_FILE  LONG RAW            No
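For reference, the same structure as DDL — reconstructed from the listing above, not pulled from the live schema (the primary key on ID is my reading of the * marker, and the constraint name is made up):

create table rsfile (
  id           number(19,0)        not null,  -- * above; assumed primary key
  version      number(19,0)        not null,
  description  varchar2(255 char),
  file_name    varchar2(255 char)  not null,
  sdrmuser_id  number(19,0)        not null,
  upload_file  long raw            not null,
  constraint rsfile_pk primary key (id)
);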
Some of the UPLOAD_FILE entries are between 100 MiB and 400 MiB.

Running the following in SQLcl:
set sqlformat csv
spool rsfile.csv
select * from rsfile order by id asc;
spool off
leads to errors like the following:
3A90AD3A904C0049099ABADC70CC460DC59E691932E4B1B6A53869B0A13966F9B0DBD0FE8B0D6A60CFD4676ABB86B667287BA6726089079674DC128F5AE2718B3D6A324736F1B5817D31D103A3BC6722FB6676DF2C3A746CAF5EDDD5A81E5099FF40CD100E1D9BA9CE547A3ABAAB971CF57CAF96EBAA99B69C6A4BF196186B895187906C8BA98E94EA3A121D31DE11226D3EDC64FC36EDB3298F45B8
Exception in thread "Thread-43" java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.Arrays.copyOf(Arrays.java:3540)
at java.base/java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:245)
at java.base/java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:751)
at java.base/java.lang.StringBuilder.append(StringBuilder.java:233)
at java.base/java.io.BufferedReader.implReadLine(BufferedReader.java:417)
at java.base/java.io.BufferedReader.readLine(BufferedReader.java:347)
at java.base/java.io.BufferedReader.readLine(BufferedReader.java:436)
at oracle.dbtools.raptor.newscriptrunner.WrapBufferOutputStreamToWriter$OutputRedirectorForSpool.run(WrapBufferOutputStreamToWriter.java:125)
Pipe closed
34 rows selected.
Elapsed: 00:03:30.163
2024-09-03 13:08:47.990 SEVERE oracle.dbtools.raptor.newscriptrunner.ScriptExecutor run java.base/java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:266)
java.io.IOException: Pipe closed
at java.base/java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:266)
at java.base/java.io.PipedInputStream.receive(PipedInputStream.java:207)
at java.base/java.io.PipedOutputStream.write(PipedOutputStream.java:125)
at oracle.dbtools.raptor.newscriptrunner.WrapBufferOutputStreamToWriter.write(WrapBufferOutputStreamToWriter.java:70)
at org.fusesource.jansi.io.AnsiOutputStream.write(AnsiOutputStream.java:153)
at java.base/java.io.FilterOutputStream.write(FilterOutputStream.java:139)
at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:125)
at java.base/java.io.BufferedOutputStream.implFlush(BufferedOutputStream.java:252)
at java.base/java.io.BufferedOutputStream.flush(BufferedOutputStream.java:240)
at oracle.dbtools.raptor.newscriptrunner.WrapListenBufferOutputStream.flush(WrapListenBufferOutputStream.java:183)
at oracle.dbtools.raptor.newscriptrunner.WrapListenBufferOutputStream.flush(WrapListenBufferOutputStream.java:183)
at oracle.dbtools.raptor.newscriptrunner.ScriptRunner.run(ScriptRunner.java:331)
at oracle.dbtools.raptor.newscriptrunner.ScriptExecutor.run(ScriptExecutor.java:364)
at oracle.dbtools.raptor.newscriptrunner.ScriptExecutor.run(ScriptExecutor.java:245)
at oracle.dbtools.raptor.scriptrunner.cmdline.SqlCli.processLine(SqlCli.java:1228)
at oracle.dbtools.raptor.scriptrunner.cmdline.SqlCli.runSqlcl(SqlCli.java:1406)
at oracle.dbtools.raptor.scriptrunner.cmdline.SqlCli.main(SqlCli.java:372)
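One knob I have not ruled out yet is SET LONG. In SQL*Plus (and, as far as I can tell, SQLcl) it caps how many bytes of a LONG/LONG RAW value are fetched, with a default of 80; my script never raises it, yet the spool clearly emitted far more hex than that before dying, so either the csv formatter ignores the cap for LONG RAW or it was already effectively unlimited in my session. If the cap is honored, a complete dump would need something like

set long 2000000000

which guarantees every hex-encoded value is buffered as one enormous line — exactly the pattern the heap dies on.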
Notes:
- Increasing heap space hasn't made any difference in my testing on a 32 GiB RAM, 64-bit Windows 11 host running a 64-bit JVM 22.0.2 (OpenJDK). Started with the default of 2G; last ran with JAVA_TOOL_OPTIONS set to -Xms4g -Xmx10g.
- The files embedded as LONG RAW are JPG, Adobe Illustrator (AI), XLSX, etc. I don't know what the precise limiter is, but files between 100 MiB and 400 MiB are problematic.
- Have not read the source code yet to see what limits are in place. It feels like some low-level component is limited to a 2 GiB byte range even though it operates in a 64-bit memory space. The trace looks consistent with that: Java arrays (and therefore StringBuilder) are int-indexed, capping any single line at about 2^31 characters, and a 400 MiB LONG RAW hex-encodes to roughly 800 million characters (~1.6 GiB as UTF-16) that BufferedReader.readLine accumulates into one StringBuilder, whose growth-by-copying briefly needs both the old and the new buffer.
- Urgency is low. Seeking an alternative tool set to accomplish the same thing; one candidate is sketched below. Context: this is an archival dump of a data model (insurance against migration faults in the new app). Thankfully the new application version has no dependency on LONG RAW, and files are stored in a file system.
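The most promising alternative I have found so far is to stop hex-encoding entirely: convert the LONG RAW column to a BLOB with TO_LOB (legal only inside CTAS or INSERT ... SELECT), then write each BLOB out server-side in 32 KiB chunks. A sketch only — rsfile_blob and the RSFILE_DUMP directory object are placeholder names, and this needs a directory the database can write to:

rem one-time conversion; to_lob is only valid in ctas / insert ... select
create table rsfile_blob as
  select id, version, description, file_name, sdrmuser_id,
         to_lob(upload_file) as upload_file
    from rsfile;

rem requires: create directory rsfile_dump as '/some/dump/path';
declare
  l_file   utl_file.file_type;
  l_buf    raw(32767);
  l_amount pls_integer;
  l_pos    integer;
  l_len    integer;
begin
  for r in (select id, file_name, upload_file from rsfile_blob order by id) loop
    -- one output file per row, named after the key and original file name
    l_file := utl_file.fopen('RSFILE_DUMP', r.id || '_' || r.file_name, 'wb', 32767);
    l_len  := dbms_lob.getlength(r.upload_file);
    l_pos  := 1;
    while l_pos <= l_len loop
      -- read and write at most 32767 bytes per pass; nothing is ever one giant line
      l_amount := least(32767, l_len - l_pos + 1);
      dbms_lob.read(r.upload_file, l_amount, l_pos, l_buf);
      utl_file.put_raw(l_file, l_buf, true);
      l_pos := l_pos + l_amount;
    end loop;
    utl_file.fclose(l_file);
  end loop;
end;
/

The scalar columns could then still be spooled to CSV exactly as before, just without UPLOAD_FILE, with ID and FILE_NAME tying each row to its dumped file.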