Hi Experts,
I've been working as a DBA for 10+ years, but I started working with GoldenGate this year. I'm doing a migration from SQL Server to Oracle. The database has several tables with over 100 million records and varchar(max) columns, which will be converted to CLOB in Oracle. Two of these tables have over 500 million records and consume 15 TB of space on SQL Server. I've run tests using FILTER (@RANGE()) to try to improve performance, but after a certain amount of load time the number of records copied per hour drops considerably. Another test used FILTER (WHERE()), but it seems that GoldenGate loads the entire table into a cursor before applying the WHERE condition.
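For reference, this is roughly how my RANGE test was set up: four parallel Replicats, each handling one slice of the rows. The schema, table, and key column names below are illustrative placeholders, not my real objects:

```
-- Replicat REP1 parameter file (one of four; REP2..REP4 use @RANGE(2,4,ID), @RANGE(3,4,ID), @RANGE(4,4,ID))
REPLICAT rep1
USERIDALIAS ogg_tgt
MAP dbo.BIG_TABLE, TARGET APP.BIG_TABLE, FILTER (@RANGE (1, 4, ID));
```

With this setup, throughput starts out acceptable but degrades as the load progresses, which is the behavior I'm trying to diagnose.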
I'd appreciate the community's help in improving the performance of this process.
The source SQL Server and target Oracle servers have 16 processors and 256GB RAM.
Thank you in advance for your help.