Hi Experts,
I have a scenario where I need to read a file from an FTP server containing >2 million records, clean out the existing rows from a table (which also holds >2 million rows) using a TRUNCATE statement, and then insert the data from the file into that table.
Here is how I developed it:
1. Copy the file from the FTP server to a local server path. Since it is a huge file, the admin does not want to keep the FTP connection open for a long time, so I copy it to the local server first.
2. Execute the TRUNCATE table proc, retaining the same step the legacy system used.
3. Use the chunked read operation on the File Adapter to read the records from the file.
4. Insert them into the table with CHUNKSIZE = 50K, using the dehydrate option to avoid the 15-minute JTA timeout.
5. Once done, move the file to the archive location.
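Outside of BPEL, steps 2-4 amount to something like the sketch below. This is only an illustration: the table name, column, and file layout are placeholders, sqlite's DELETE stands in for the TRUNCATE proc, and the real flow uses the File Adapter's chunked read and the DB Adapter rather than direct SQL.

```python
import sqlite3

CHUNK_SIZE = 50_000  # matches the CHUNKSIZE in step 4 (illustrative)

def load_file_into_table(conn: sqlite3.Connection, path: str) -> None:
    # Step 2: clear the existing rows (the real flow calls a TRUNCATE proc;
    # sqlite has no TRUNCATE, so DELETE stands in here)
    conn.execute("DELETE FROM target")
    conn.commit()
    # Steps 3-4: read the file in chunks and batch-insert each chunk
    with open(path) as f:
        chunk = []
        for line in f:
            chunk.append((line.rstrip("\n"),))
            if len(chunk) == CHUNK_SIZE:
                conn.executemany("INSERT INTO target(rec) VALUES (?)", chunk)
                conn.commit()  # roughly where BPEL would dehydrate
                chunk = []
        if chunk:  # final partial chunk
            conn.executemany("INSERT INTO target(rec) VALUES (?)", chunk)
            conn.commit()
```

Note that because each chunk commits separately (as the dehydrate points do), a failure partway leaves the table partially loaded, which is exactly the recovery problem I describe below.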
I have enabled the following properties to roll back the transaction. I am not sure whether they work for a load this large; this is for your reference:
<property name="bpel.config.transaction" type="xs:string" many="false">required</property>
<property name="bpel.config.oneWayDeliveryPolicy" type="xs:string" many="false">sync</property>
<property name="partnerLink.dbTruncTable.idempotent" type="xs:string" many="false">true</property>
<property name="partnerLink.dbMIRToCMS.idempotent" type="xs:string" many="false">true</property>
<property name="partnerLink.fileReadQOF.idempotent" type="xs:string" many="false">true</property>
My question: how can I restore the old records to the table if the transaction fails during or after the TRUNCATE step? As I understand it, TRUNCATE is DDL and commits implicitly, so it cannot simply be rolled back.
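One fallback I am considering is copying the current rows to a backup table before the truncate, and restoring from it if the load fails. A rough sketch of that idea (table names are placeholders; sqlite's DELETE again stands in for the TRUNCATE proc, and `load_fn` represents the chunked-insert step):

```python
import sqlite3

def safe_reload(conn: sqlite3.Connection, load_fn) -> None:
    # Keep a copy of the current rows before destroying them
    conn.execute("DROP TABLE IF EXISTS target_backup")
    conn.execute("CREATE TABLE target_backup AS SELECT * FROM target")
    try:
        conn.execute("DELETE FROM target")  # stands in for the TRUNCATE proc
        load_fn(conn)                       # chunked insert of the new file
        conn.commit()
    except Exception:
        # Restore the old rows if anything failed during or after the truncate
        conn.execute("DELETE FROM target")
        conn.execute("INSERT INTO target SELECT * FROM target_backup")
        conn.commit()
        raise
```

I am not sure how well this scales at >2 million rows, or whether a staging-table swap would be a better fit in SOA, so advice on that is welcome too.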
Please advise. Any help is much appreciated!!
Please let me know if you need more information.
Thanks