Keeping a consistent execution plan in a highly volatile data environment
Hi,
In our environment the data is highly volatile, which causes the optimizer statistics to go stale quickly.
This is how our environment works.
Most of the day's data is inserted into a few specific tables, which start the day nearly empty and grow steadily. For example:
Tibex_order at 9 AM has 100 records
Tibex_order at 10 AM has 100,000 records
Tibex_order at 1 PM has 1 million records
Tibex_order at 2 PM has 3 million records
Tibex_order at 4 PM has 4 million records
Tibex_order at 5 PM has 5 million records
At end of day (EOD) we gather statistics on Tibex_order, which at that point holds 5 million records, and then truncate it, so it goes back down to 0 records.
The same process happens on another table, Tibex_Quote. A rough sketch of the EOD job is shown below.
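For clarity, this is roughly what the EOD routine does (the schema name TRADE and the exact DBMS_STATS parameters here are placeholders, not our real settings):

BEGIN
  -- gather statistics while the table still holds the full day's ~5 million rows
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'TRADE',                         -- placeholder schema
    tabname          => 'TIBEX_ORDER',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    method_opt       => 'FOR ALL COLUMNS SIZE AUTO',
    cascade          => TRUE);                            -- also gather index statistics
END;
/

-- the table is then emptied for the next trading day
TRUNCATE TABLE trade.tibex_order;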
The problem arises the next day: the table actually has about 1 million records, but the statistics still say 5 million. When users run their queries the execution plan changes, a query takes 2 minutes, and the application fails. I can see a lot of I/O and db file sequential read waits, and the database performs very badly.
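To illustrate, the mismatch the next morning looks roughly like this (TRADE is a placeholder schema owner):

-- optimizer statistics still reflect yesterday's final row count
SELECT num_rows, last_analyzed
FROM   dba_tables
WHERE  owner = 'TRADE'
AND    table_name = 'TIBEX_ORDER';        -- shows roughly 5,000,000

-- but the actual data volume at that point in the day is much lower
SELECT COUNT(*) FROM trade.tibex_order;   -- shows roughly 1,000,000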
Kindly guide me on how to overcome this kind of issue.
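One option I have read about, but am not sure of, is to set representative statistics once and lock them so the nightly gather/truncate cycle cannot overwrite them. A rough sketch of what I mean (the schema name and the row/block counts below are just placeholder values):

BEGIN
  -- manually set statistics that represent a "typical" intraday volume
  DBMS_STATS.SET_TABLE_STATS(
    ownname => 'TRADE',            -- placeholder schema
    tabname => 'TIBEX_ORDER',
    numrows => 2000000,            -- placeholder: a representative row count
    numblks => 50000);             -- placeholder: a representative block count

  -- prevent any later gather job from replacing these statistics
  DBMS_STATS.LOCK_TABLE_STATS(
    ownname => 'TRADE',
    tabname => 'TIBEX_ORDER');
END;
/

Would something like this be the right approach, or is there a better way to handle such volatile tables?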
Regards
NM