
Load and read XML files larger than 4 GB

taohiko, Mar 18 2014 (edited Mar 19 2014)

Hi All

My environment is Oracle 10.2.0.4 on Solaris, and I have PL/SQL processes that work with an XML file as detailed below:

1. I read the XML file over HTTP and load it into an XMLTYPE column in a table.

2. I read the XML stored in step 1 from the table and extract values to insert into another table.

On the test DB everything works, but I get the error below when I use the production XML file:

     ORA-31186: Document contains too many nodes

The current XML file is about 100 MB, but the procedure must support XML files larger than 4 GB in the future.

Below are some parts of my code for your info.

1. Read the XML over HTTP in chunks into a variable, append it to a CLOB and insert it into the table

BEGIN
   LOOP
      -- read the HTTP response in 32K chunks and append each chunk to the staging CLOB
      UTL_HTTP.read_text(http_resp, v_resptext, 32767);
      DBMS_LOB.writeappend(v_clob, LENGTH(v_resptext), v_resptext);
   END LOOP;
EXCEPTION
   WHEN UTL_HTTP.end_of_body THEN
      UTL_HTTP.end_response(http_resp);
END;

INSERT INTO XMLTAB VALUES (XMLTYPE(v_clob));
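
The surrounding declarations and request setup are roughly like this (simplified sketch only; the URL below is just a placeholder, not the real feed):

DECLARE
   http_req    UTL_HTTP.req;
   http_resp   UTL_HTTP.resp;
   v_resptext  VARCHAR2(32767);
   v_clob      CLOB;
BEGIN
   -- stage the response in a temporary CLOB before converting it to XMLTYPE
   DBMS_LOB.createtemporary(v_clob, TRUE);

   -- placeholder URL
   http_req  := UTL_HTTP.begin_request('http://server/path/file.xml');
   http_resp := UTL_HTTP.get_response(http_req);

   -- ... read loop, end_of_body handler and INSERT as shown above ...

   DBMS_LOB.freetemporary(v_clob);
END;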

2. Read cell values from the XML column and insert them into another table

DECLARE
   CURSOR c_xml IS
      SELECT TRIM(y.cvalue)
      FROM   XMLTAB xt,
             XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                      COLUMNS cvalue VARCHAR2(50) PATH '.') y;

    :

    :

BEGIN

    :

    :

   OPEN c_xml;
   LOOP
      FETCH c_xml INTO v_TempValue;
      EXIT WHEN c_xml%NOTFOUND;
      <Generate insert statement into another table>
   END LOOP;
   CLOSE c_xml;

    :

END;
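
For illustration, the same extraction written as a single set-based insert would look roughly like this (TARGET_TAB and cell_value are made-up names; my real procedure builds the insert row by row inside the cursor loop):

-- TARGET_TAB / cell_value are hypothetical names for the destination table
INSERT INTO TARGET_TAB (cell_value)
SELECT TRIM(y.cvalue)
FROM   XMLTAB xt,
       XMLTable('/Table/Rows/Cells/Cell' PASSING xt.XMLDoc
                COLUMNS cvalue VARCHAR2(50) PATH '.') y;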

One more problem is performance when the XML file is big: the first step, loading the XML content into the XMLTYPE column, is slow.

Could you please suggest a solution to read large XML files and improve performance?

Thank you in advance.

Hiko      

This post has been answered by taohiko on Mar 19 2014