Hi,
I have a table with 300+ columns and 13 million rows, stored in a 32 KB block size tablespace. It is a table in a data warehouse environment. The number of rows in the table hasn't changed much, but the time taken to collect statistics has increased significantly.
Initially it took only 15 minutes (with the same 13M rows); now it runs for 4+ hours. The maximum number of parallel servers is 4 (unchanged). The table is not partitioned.
OS: HP UX Itanium
Database: Oracle 11g (11.2.0.2)
Command is:
exec dbms_stats.gather_table_stats(ownname=>'ABC', tabname=>'ABC_LOAD', estimate_percent=>dbms_stats.auto_sample_size, cascade=>TRUE, degree=>dbms_stats.auto_degree);
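For reference, the run times can be cross-checked against the optimizer statistics history. A query along these lines against DBA_OPTSTAT_OPERATIONS (assuming the TARGET column is populated for table-level gathers in 11.2.0.2) should list recent runs and their durations:

-- Recent statistics gathering operations and how long each took
SELECT operation,
       target,
       start_time,
       end_time,
       (end_time - start_time) AS duration
  FROM dba_optstat_operations
 WHERE operation = 'gather_table_stats'
 ORDER BY start_time DESC;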
I would like to understand:
1) What could have caused this change in time from 15 minutes to 4+ hours? What should I look for?
2) How can we gather statistics on a huge table faster?
Thanks in advance
Cherrish Vaidiyan
Update:
I posted this in SQL/PLSQL and was requested to post it in Database - General.