Folks,
We are new to large-scale partition management. We searched this forum for comparable questions but, perhaps because of our search terms, found nothing similar, so we decided to start a new thread.
We have many partitioned tables like the following:
CREATE TABLE foo (
    ....
    ora_archive_state INTEGER
)
PARTITION BY RANGE (ora_archive_state) (
    PARTITION foo_live VALUES LESS THAN (1)
)
ENABLE ROW MOVEMENT;
NOTE: We have chosen to use ora_archive_state as our partition key even though we are running on 11gR2. There is no clear, native column in the source data for tagging each batch. Because we are on 11gR2 we can't use ROW ARCHIVAL (ALTER TABLE foo ROW ARCHIVAL), but the naming keeps open the possibility (we may or may not use it) of switching to automated row archival in 12c when we finally upgrade.
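For context, the 12c feature we have in mind is In-Database Archiving. As we understand the 12c documentation (we have not been able to test this, since we are still on 11gR2), it is enabled like so:

```sql
-- 12c only: adds a hidden ORA_ARCHIVE_STATE column maintained by Oracle.
ALTER TABLE foo ROW ARCHIVAL;

-- Archived rows are hidden from queries by default;
-- this makes them visible again in the current session:
ALTER SESSION SET ROW ARCHIVAL VISIBILITY = ALL;
```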
Now, we receive data in batches from external suppliers. The initial load lives in FOO_LIVE with an ora_archive_state value of 0. As each new batch arrives, the current LIVE data needs to be moved to a new partition whose ora_archive_state value is the next available one, i.e. 1 (VALUES LESS THAN (2)).
One method for doing this is:
ALTER TABLE foo ADD PARTITION FOO_1 VALUES LESS THAN (2);
With row movement enabled:
UPDATE FOO SET ORA_ARCHIVE_STATE = 1;
COMMIT;
Data is now moved to FOO_1.
FOO_LIVE is now available for the new data. We can do straight inserts with ora_archive_state set to 0.
INSERT INTO FOO (......,ora_archive_state) VALUES (.....,0);
Given that some of our batches have more than 10 million records, we wonder whether the above is the best and fastest method for moving LIVE data to a new partition in a production database, where we are often given only a small window (a few hours) to do the migration and import. Could we instead load the new data into a separate table and then EXCHANGE PARTITION with FOO_LIVE?
Any suggestions or help greatly appreciated.
Simon