What are some general practices/best practices developers use to test processes for performance in Oracle? I fairly often get a requirement from management like...
1. Please create a new business process to calculate the currency exchange rate on all prices in Canada
--the process gets written, and I believe performance is fine: I check the explain plan, verify appropriate indexes, etc.
2. Management is happy, but says this process may get called 40-50 times a second. We should make sure it is optimized and won't hurt our system in production.
Please test to make sure this will not overload our system.
3. I go...????
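One way I could turn "40-50 times a second" into a concrete, testable target is Little's Law: average in-flight executions = arrival rate x mean call latency. A quick sketch (the session limit of 10 is an invented example, not part of the actual requirement):

```python
# Back-of-the-envelope capacity check using Little's law:
# concurrent executions = arrival rate (calls/s) x mean call latency (s).

def required_latency(calls_per_sec, max_concurrent_sessions):
    """Max mean latency (seconds) per call before the workload needs
    more than max_concurrent_sessions executing at once."""
    return max_concurrent_sessions / calls_per_sec

def concurrency(calls_per_sec, mean_latency_sec):
    """Average number of in-flight executions at a given rate/latency."""
    return calls_per_sec * mean_latency_sec

# Example: at 50 calls/s, to stay within 10 concurrent sessions,
# each call must average 200 ms or less.
print(required_latency(50, 10))   # -> 0.2
print(concurrency(50, 0.1))       # -> 5.0
```

That at least gives me a number to test against instead of a vague "won't hurt our system."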
At this point I am stuck on how to "prove" it's a well-performing process. Are there any best practices people use to handle this kind of situation, or is the situation itself just not best practice?
Some thoughts are:
1. Create a fresh, static database and run the process 100,000 times. Ask the DBAs if they see any signs of poor performance
2. Do the same as above, but try to simulate more real-world scenarios: kick off multiple parallel worker threads, try to overload the process, and monitor how it scales
3. Use AWR reports
4. Run the process a lot and then see if the ADDM report has any suggestions
5. Use some form of Oracle's Database Replay / Real Application Testing and test that way
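For thought #2, something like this minimal driver is what I have in mind for collecting latency percentiles under concurrency. `run_process` here is just a stub standing in for the real call (in practice it would execute the procedure over a pooled connection with a driver such as python-oracledb), and the call/worker counts are placeholders:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def run_process():
    """Stub for the real call; in practice this would invoke the
    PL/SQL procedure over a pooled database connection."""
    time.sleep(0.01)  # simulate a 10 ms call

def drive_load(call_fn, total_calls, workers):
    """Fire total_calls invocations across `workers` threads and
    return the per-call latencies in seconds."""
    def timed_call(_):
        start = time.perf_counter()
        call_fn()
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(timed_call, range(total_calls)))

latencies = drive_load(run_process, total_calls=200, workers=10)
print(f"p50={statistics.median(latencies) * 1000:.1f} ms")
print(f"p95={statistics.quantiles(latencies, n=20)[-1] * 1000:.1f} ms")
```

The flaw I see is that this measures the client side only; it says nothing about what the load does to everyone else on the box, which is where AWR/ADDM would have to come in.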
I see both merit and flaws in the above suggestions, but I don't have a "better" way to go about it. Is this approach hopeless, trying to show something "performs well" when "performs well" has never been defined? Or are there some basic metrics/best practices I can test against?
In my particular case I am currently running a 1/8th-rack Exadata on 11.2.0.4, and I have full Diagnostics and Tuning Pack licensing as well.
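For thoughts #3 and #4, the best idea I have so far is to bracket each test run with explicit AWR snapshots so the report window contains only my workload. A rough sketch, assuming a DB-API connection (e.g. python-oracledb) and EXECUTE on DBMS_WORKLOAD_REPOSITORY:

```python
def take_snapshot(conn):
    """Create an AWR snapshot and return its snap_id.

    `conn` is assumed to be a DB-API connection (e.g. python-oracledb)
    for a user with EXECUTE on DBMS_WORKLOAD_REPOSITORY.
    """
    with conn.cursor() as cur:
        return cur.callfunc("dbms_workload_repository.create_snapshot", int)

def bracketed_run(conn, workload_fn):
    """Snapshot, run the workload, snapshot again.

    The returned (begin, end) snap_id pair is what awrrpt.sql or
    DBMS_WORKLOAD_REPOSITORY.AWR_REPORT_TEXT asks for, so the report
    covers only this test and nothing else.
    """
    begin_snap = take_snapshot(conn)
    workload_fn()
    end_snap = take_snapshot(conn)
    return begin_snap, end_snap
```

Even then I'm not sure what thresholds in the resulting report would count as "proof" that the process won't overload production.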