
SQL & PL/SQL


How to handle larger datasets

3391232 · Mar 31 2017 (edited Mar 31 2017)

Hi Team,

I have a very large table with 50 million records, each with 25 columns.

On a daily basis, we get a delta file from the source system containing inserts and updates. This file may contain around 30k records, which can be loaded into the table very easily using any ETL tool.
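For the delta, something like the MERGE below is roughly what the ETL does (names here are just placeholders; the real table has 25 columns, and K1..K4 stand for the 4 key columns):

MERGE INTO target_tbl t
USING stg_delta s
   ON (t.k1 = s.k1 AND t.k2 = s.k2 AND t.k3 = s.k3 AND t.k4 = s.k4)
WHEN MATCHED THEN
  UPDATE SET t.col5 = s.col5,
             t.col6 = s.col6   -- ... remaining non-key columns
WHEN NOT MATCHED THEN
  INSERT (k1, k2, k3, k4, col5, col6)
  VALUES (s.k1, s.k2, s.k3, s.k4, s.col5, s.col6);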

The problem is with deletes. The source system cannot provide deletes, so as a workaround it sends a full dump of the keys (4 columns) to compare against the target table, so we can identify the records that have no match in the file.

I am thinking of loading the key dump into a temporary table and comparing it against the target table to find the deletes. Am I headed in the right direction?
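Roughly what I have in mind (again, the names are placeholders, K1..K4 are the 4 key columns, and the datatypes would match the target table):

-- Temporary table to hold the daily full key dump
CREATE GLOBAL TEMPORARY TABLE keys_gtt (
  k1 VARCHAR2(50),
  k2 VARCHAR2(50),
  k3 VARCHAR2(50),
  k4 VARCHAR2(50)
) ON COMMIT PRESERVE ROWS;

-- After loading the dump (SQL*Loader, or an external table over the file),
-- remove target rows whose keys no longer appear in the dump
DELETE FROM target_tbl t
 WHERE NOT EXISTS (
         SELECT NULL
           FROM keys_gtt k
          WHERE k.k1 = t.k1
            AND k.k2 = t.k2
            AND k.k3 = t.k3
            AND k.k4 = t.k4
       );

An external table over the key file would avoid the separate load step, and I am hoping the optimizer turns the NOT EXISTS into a hash anti-join rather than probing the 50 million rows one by one.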

Please share your thoughts if you have come across a similar requirement.

Thanks...
