Performance testing - What should I change and test against the baseline
We are trying to optimize performance in test before going to prod.
We currently have a 9G SGA with the individual pools hard coded (no automatic memory management).
So my first idea is to switch to ASMM (automatic shared memory management) and test sga_target at 5G, 7G, 9G and 11G.
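In case it helps anyone picture the test harness: assuming the instance uses an spfile, moving from manually sized pools to ASMM and changing the target between runs is roughly the following (the values and the list of pools are only illustrative, adjust to your setup):

    -- set a ceiling high enough for the largest test run (needs a bounce)
    ALTER SYSTEM SET sga_max_size = 11G SCOPE=SPFILE;
    -- turn on ASMM and let it manage the pools
    ALTER SYSTEM SET sga_target = 9G SCOPE=SPFILE;
    ALTER SYSTEM SET shared_pool_size = 0 SCOPE=SPFILE;
    ALTER SYSTEM SET db_cache_size = 0 SCOPE=SPFILE;
    ALTER SYSTEM SET large_pool_size = 0 SCOPE=SPFILE;
    ALTER SYSTEM SET java_pool_size = 0 SCOPE=SPFILE;
    -- after the bounce, sga_target can be moved between runs without another restart
    ALTER SYSTEM SET sga_target = 7G SCOPE=BOTH;

Keeping sga_max_size at the largest value you plan to test means only sga_target has to change between runs.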
Then take whichever of these looks best and test db_file_multiblock_read_count at 16 and 32. I have read that higher values are beneficial for DW workloads, but this is OLTP. It is currently 16 in prod, although the vendor recommended 32 when we upgraded to 10g.
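db_file_multiblock_read_count is dynamic, so assuming an spfile it can be flipped between runs without a restart, something like:

    SHOW PARAMETER db_file_multiblock_read_count
    ALTER SYSTEM SET db_file_multiblock_read_count = 32 SCOPE=BOTH;
    -- run the OLTP workload, capture the AWR/statspack snapshots, then
    ALTER SYSTEM SET db_file_multiblock_read_count = 16 SCOPE=BOTH;

If you are on 10gR2, leaving the parameter unset and letting Oracle calculate it itself may be worth a third run.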
Finally, we have had some luck in other databases by setting optimizer_features_enable to 9.2.0 after upgrading to 10g when performance is poor.
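For what it's worth, optimizer_features_enable can also be tried at the session level first, which makes it easy to compare a 9.2-style plan against the 10g plan for a specific problem statement before touching the whole instance; a rough sketch:

    ALTER SESSION SET optimizer_features_enable = '9.2.0';
    -- explain/run the problem statement and compare plans; if the whole workload benefits:
    ALTER SYSTEM SET optimizer_features_enable = '9.2.0' SCOPE=BOTH;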
I can't seem to find a good reference article that says: use this as your baseline, then test this parameter at x and y, then that parameter at a and b.
The plan above is just something I have put together from colleagues, the web and books.
Any advice/help/links appreciated.
Thanks,
MN