Tune Query with Millions of Records
Hi everyone,
I've been handed an Oracle 11g tuning task and I'm fairly new to query tuning.
The query itself is only about 10-15 lines of SQL, but it hits four tables: one with 100 million records, one with 8 million, and two that are comparatively small (6,000 and 300 records). The problem I'm having is that the query actually needs to aggregate about 3 million records.
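To give a rough idea of the shape (the table and column names below are made up, not our real schema), it's basically a star-style aggregation like this:

-- hypothetical star-schema aggregate: one large fact joined to three dimensions
SELECT d1.region_name,
       d2.fiscal_month,
       SUM(f.sales_amount) AS total_sales,
       COUNT(*)            AS row_cnt
FROM   sales_fact   f
JOIN   region_dim   d1 ON d1.region_key   = f.region_key
JOIN   calendar_dim d2 ON d2.calendar_key = f.calendar_key
JOIN   product_dim  d3 ON d3.product_key  = f.product_key
WHERE  d2.fiscal_year  = 2010
AND    d3.product_type = 'WIDGET'
GROUP BY d1.region_name, d2.fiscal_month;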
I found an article about setting star_transformation_enabled = true, so on the fact table I created bitmap indexes on all of the foreign key columns, and each dimension has a standard primary key defined on its surrogate key. This strategy works, but it still takes a long time for the query to crunch the 3 million records (about 30 minutes).
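For reference, the setup I ended up with looks roughly like this (same made-up names as above):

-- parameter from the article (could also be set system-wide)
ALTER SESSION SET star_transformation_enabled = TRUE;

-- bitmap index on each foreign key column of the fact table
CREATE BITMAP INDEX sales_fact_region_bix   ON sales_fact (region_key);
CREATE BITMAP INDEX sales_fact_calendar_bix ON sales_fact (calendar_key);
CREATE BITMAP INDEX sales_fact_product_bix  ON sales_fact (product_key);

-- dimensions keep an ordinary primary key on the surrogate key
ALTER TABLE region_dim ADD CONSTRAINT region_dim_pk PRIMARY KEY (region_key);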
I know there's also the option of building materialized views and using query rewrite to take advantage of the MVs, but my problem with that is that we're using OBIEE and we can't control how many different variations of these queries we see, so we would have to build a ton of MVs.
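Just to be clear about what I mean by the MV route, it would be something along these lines (hypothetical names again, and this is only one of the many groupings OBIEE might generate):

-- one possible summary; complete refresh on demand just to keep the example simple
CREATE MATERIALIZED VIEW sales_by_region_month_mv
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
ENABLE QUERY REWRITE
AS
SELECT d1.region_name,
       d2.fiscal_month,
       SUM(f.sales_amount) AS total_sales,
       COUNT(*)            AS row_cnt
FROM   sales_fact   f
JOIN   region_dim   d1 ON d1.region_key   = f.region_key
JOIN   calendar_dim d2 ON d2.calendar_key = f.calendar_key
GROUP BY d1.region_name, d2.fiscal_month;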
What are the best ways to tackle high-volume queries like this from a system-wide perspective?
Are there any benchmarks for what I should be seeing on a 3-million-record aggregation? Is expecting under a minute even reasonable?
Any help would be appreciated!
Thanks!
-Joe