We have a process that uses an Oracle XML data store to store XML documents. We have seen some performance issues, and I thought I would post here to see if anyone with experience with Oracle XML data stores could assist.
Some general background:
The process was written about 8-10 years ago in Oracle 8. Currently our database is at 11g.
We use ExtractValue in our SQL to access the data, and for indexing.
We are using XMLType to define the data store.
We are depositing XML files using Oracle FTP.
We are using stored procedures for the processing code and extract code.
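For reference, the storage and indexing look roughly like the following. Table, index, and XPath names here are made up for illustration, not our actual schema; the CLOB storage was carried forward from the original implementation:

```sql
-- XMLType object table holding incoming documents
-- (CLOB storage, carried forward from the original implementation)
CREATE TABLE xml_doc_store OF XMLType
  XMLTYPE STORE AS CLOB;

-- Function-based index on the field the matching SQL filters on
CREATE INDEX xml_doc_partner_ix ON xml_doc_store x
  (ExtractValue(VALUE(x), '/Document/Header/PartnerId'));
```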
The process works as follows:
A file is deposited via Oracle FTP, and a stored procedure is kicked off to try to find a matching document in the data store. The match is based on flexible criteria that are set up on a per-customer/trading-partner basis.
If a match is found, it is processed and the matched documents are moved to a match table.
We do periodic extracts that pull the data out of the documents, again based on per-customer/trading-partner criteria.
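Simplified, the matching procedure does something like this (procedure, table, and path names are illustrative; the real matching criteria are driven by per-partner configuration):

```sql
CREATE OR REPLACE PROCEDURE match_incoming_doc (p_doc IN XMLType) IS
  v_partner VARCHAR2(100);
  v_rowid   ROWID;
BEGIN
  -- Key used for matching; real criteria vary per trading partner
  v_partner := p_doc.extract(
                 '/Document/Header/PartnerId/text()').getStringVal();

  -- Look for a stored document with the same key; this predicate is
  -- what the function-based ExtractValue index is meant to serve
  SELECT x.rowid INTO v_rowid
    FROM xml_doc_store x
   WHERE ExtractValue(VALUE(x), '/Document/Header/PartnerId') = v_partner
     AND ROWNUM = 1;

  -- On a hit, move the matched document to the match table
  INSERT INTO xml_doc_matched
    SELECT VALUE(x) FROM xml_doc_store x WHERE x.rowid = v_rowid;
  DELETE FROM xml_doc_store x WHERE x.rowid = v_rowid;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    NULL;  -- no match yet; the new document waits in the store
END;
```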
Our issue is volume - small numbers of files are handled okay, but once we get into larger numbers (and 'large' here is defined as 500-1000 files, so not really that large), it takes significant time to run. A test that deposits 1008 documents, 504 of which match, and does an extract takes 2.5 hours to run. We were living with this up to now because we had no customers who ran large numbers of files at the same time.
We recently attempted to implement TDE encryption and Oracle compression on the XML data store and associated tables. The processing time for that performance test went from 2.5 hours to 7 hours.
After going several rounds with Oracle support over whether encryption was the issue, we are now attempting to tune the SQL, indexes, and data store to improve performance.
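In case it helps frame answers, one direction we are experimenting with (index and path names are again illustrative) is moving the predicates off ExtractValue, which 11g deprecates in favor of XMLEXISTS/XMLTABLE, and adding an XMLIndex over the paths the matching SQL actually hits:

```sql
-- XMLIndex restricted to the hot path used by the matching predicate
-- (PATH TABLE name is illustrative)
CREATE INDEX xml_doc_xix ON xml_doc_store (OBJECT_VALUE)
  INDEXTYPE IS XDB.XMLIndex
  PARAMETERS ('PATHS (INCLUDE (/Document/Header/PartnerId))
               PATH TABLE xml_doc_path_tab');

-- Matching predicate rewritten with XMLExists instead of ExtractValue
SELECT COUNT(*)
  FROM xml_doc_store x
 WHERE XMLExists('/Document/Header[PartnerId=$p]'
                 PASSING OBJECT_VALUE, :partner AS "p");
```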
Any suggestions from experience on how to tune this type of processing would be welcome. Thanks.