Oracle Database Discussions

Archiving and purging data from huge tables

1004573, Apr 22 2013 (edited Apr 22 2013)
Hello,
I'm currently working on a project to archive old data and then purge that same data from the main tables.

Here is a detailed description:
There are around 50-odd tables from which I need to archive old data (matching certain filter conditions, not date based). That is, I first have to store the matching rows in a temp table, and once they are stored there, delete those rows from the main table. The temp table will later be exported and stored in an archive database (a separate database).
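To make the approach concrete, here is a minimal sketch of the archive step for one table, with hypothetical names (big_table, a status-based filter) standing in for the real ones. We are assuming NOLOGGING and a PARALLEL degree of 8 just to speed up the copy; the real filter conditions would replace the status predicate:

    CREATE TABLE tmp_big_table
    NOLOGGING
    PARALLEL 8
    AS
    SELECT /*+ PARALLEL(t, 8) */ *
    FROM   big_table t
    WHERE  t.status = 'CLOSED';  -- placeholder for the real old-data conditions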

These tables are very large. One of them is 250 GB in size, and all of these tables carry many indexes, both normal (B-tree) and bitmap.
The 250 GB table has 40 million rows that need to be archived and purged, out of 540 million rows in total.
On this table alone there are 50 bitmap indexes and 2 normal indexes. The table is partitioned on a date column, but that date column is not usable for identifying the old data.
There are around 20 tables quite similar in size to the one described above; the rest are somewhat smaller.

Here is the twist to the tale: we have to execute this activity over a weekend, which gives us about 48 hours to complete it.
I would like your suggestions on the best possible ways to handle this activity. Most importantly, we must be able to finish within the specified 48-hour window.

The solution we are currently thinking of is:
1. Create the temp table: CREATE TABLE tmp_tbl AS SELECT * FROM main_table WHERE <<conditions identifying old data>>
2. Once the temp table is created, capture the DDL of the indexes that exist on the main table and then drop them.
3. Execute a PL/SQL script to perform a bulk delete from the main table, committing every 100,000 rows (steps 2 and 3 are sketched below).
4. Once the bulk delete is finished, recreate the indexes on the main table from the DDL captured in step 2.
Our main worry is step 4: considering the size of these tables and the number of indexes to be rebuilt, we are not sure how long the index recreation will run for each table.
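For what it's worth, here is a rough sketch of how we are thinking of doing steps 2 and 3, again with hypothetical names (APP_OWNER, MAIN_TABLE, a status filter) in place of the real ones. For step 2, the index DDL can be pulled with DBMS_METADATA before dropping the indexes:

    SELECT DBMS_METADATA.GET_DDL('INDEX', index_name, owner)
    FROM   all_indexes
    WHERE  owner = 'APP_OWNER'       -- hypothetical schema
    AND    table_name = 'MAIN_TABLE';

And for step 3, a bulk delete that commits every 100,000 rows (note this fetches across commits from a plain cursor, so it could hit ORA-01555 on a busy system):

    DECLARE
      CURSOR c_old IS
        SELECT rowid AS rid
        FROM   main_table
        WHERE  status = 'CLOSED';      -- same old-data conditions as step 1
      TYPE t_rid_tab IS TABLE OF ROWID;
      l_rids t_rid_tab;
    BEGIN
      OPEN c_old;
      LOOP
        FETCH c_old BULK COLLECT INTO l_rids LIMIT 100000;
        EXIT WHEN l_rids.COUNT = 0;
        FORALL i IN 1 .. l_rids.COUNT  -- array delete by rowid
          DELETE FROM main_table
          WHERE  rowid = l_rids(i);
        COMMIT;                        -- commit per 100,000-row batch
      END LOOP;
      CLOSE c_old;
    END;
    /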

In the worst case, depending on how things go, we may have to split the activity into 2-3 phases spread across 2-3 weekends. Even then, we are not sure whether we will be able to pull it off.

Please share your suggestions or any other ways to execute this activity efficiently.

The database we are using is Oracle 10g.

Thanks
Ravi