Impact analysis of Alter Table
Suppose there are around 50 tables to be altered to add some columns, and each table is used by roughly 25-30 packages/procedures. Is there a mechanism or tool you use to analyse which objects might be impacted by altering the tables when there are hundreds of objects to analyse? For example, some procedure might be using a straight INSERT INTO &lt;table&gt; VALUES (c1..c10) instead of a specific column list. Ideally it should not be that way, but the impact analysis helps fix that problem too. Right now I can only think of two approaches.
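Not a full answer, but as a starting point the data dictionary can narrow the search. A rough sketch (table names MY_TAB1/MY_TAB2 are placeholders for your own list): the first query lists PL/SQL objects that depend on the tables via DBA_DEPENDENCIES, and the second does a crude text search of DBA_SOURCE for positional INSERTs that would break when a column is added.

```sql
-- Objects that reference the tables being altered
SELECT d.owner, d.name, d.type
FROM   dba_dependencies d
WHERE  d.referenced_type = 'TABLE'
AND    d.referenced_name IN ('MY_TAB1', 'MY_TAB2')
AND    d.type IN ('PACKAGE', 'PACKAGE BODY', 'PROCEDURE',
                  'FUNCTION', 'TRIGGER', 'VIEW')
ORDER  BY d.referenced_name, d.owner, d.name;

-- Crude check for "INSERT INTO <table> VALUES" without a column list;
-- false positives/negatives are possible (multi-line statements,
-- dynamic SQL), so treat hits as candidates for manual review.
SELECT s.owner, s.name, s.type, s.line, s.text
FROM   dba_source s
WHERE  UPPER(s.text) LIKE '%INSERT%INTO%MY_TAB1%VALUES%'
ORDER  BY s.owner, s.name, s.line;
```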
1) Manually checking all of them
2) Take a test schema: for each table, get all the dependent objects (from DBA_DEPENDENCIES, or by searching DBA_SOURCE), check their status in DBA_OBJECTS, add the columns, recheck the status, run a compile-all script to recompile all the objects, then recheck the status in DBA_OBJECTS.
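The second approach can at least be scripted so it is less tiring to repeat per table. A minimal sketch, assuming a test schema named MY_SCHEMA and a placeholder table MY_TAB1:

```sql
-- 1. Snapshot which objects are already INVALID before the change
SELECT owner, object_name, object_type
FROM   dba_objects
WHERE  status = 'INVALID';

-- 2. Apply the change
ALTER TABLE my_tab1 ADD (new_col VARCHAR2(30));

-- 3. Recompile everything in the schema
EXEC DBMS_UTILITY.compile_schema(schema => 'MY_SCHEMA');

-- 4. Anything still INVALID after recompilation is genuinely
--    impacted and needs a code fix, not just a recompile
SELECT owner, object_name, object_type
FROM   dba_objects
WHERE  status = 'INVALID';
```

Comparing the before/after INVALID lists per table gives the impact report; note that a successful recompile does not catch positional INSERTs, which compile fine until a row is actually inserted with too few values.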
Of course, both mechanisms are tiring and error prone, and the effort grows with the number of tables to be modified. If you have any easier mechanisms, or know of a tool that does this for us, please let me know. Thank you.
Edited by: Clearance 6`- 8`` on Apr 27, 2010 10:12 AM