Hi Team,
I have a very large table with 50 million records, each with 25 columns.
On a daily basis, we get a delta file from the source system containing inserts and updates. The file has around 30k records, which can be loaded into the table very easily using any ETL tool.
The problem is with deletes. The source system can't provide delete records, so as a workaround it sends a full dump of the keys (4 columns) to compare against the target table, so we can identify the records that have no match in the file.
I am thinking of loading the keys dump into a temporary (staging) table and comparing it against the target table to find the deletes. Am I headed in the right direction? A rough sketch of what I have in mind is below.
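For example, something along these lines (the table names target_table and stg_keys_dump and the key columns key1..key4 are just placeholders, not our real names):

    -- stg_keys_dump is the staging table loaded from the daily keys file.
    -- A target record with no matching key in the dump was deleted in source.
    SELECT t.key1, t.key2, t.key3, t.key4
    FROM   target_table t
    WHERE  NOT EXISTS (
             SELECT 1
             FROM   stg_keys_dump s
             WHERE  s.key1 = t.key1
               AND  s.key2 = t.key2
               AND  s.key3 = t.key3
               AND  s.key4 = t.key4
           );

My thinking is to index the staging table on the 4 key columns so the anti-join against the 50-million-row table performs reasonably, and to feed the result into a soft-delete flag update (or a batched DELETE) rather than one big hard delete.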
Please share your thoughts if you have come across a similar requirement.
Thanks...