Hi,
At one customer site we see generated statements, actually reports. The parse time for such a statement is over an hour, if parsing finishes at all.
We may also see ORA-04031 when we run out of memory in the shared pool.
How big these statements are is hard to say, since it depends on formatting; with SQL Developer formatting, one typical example comes to more than 130,000 lines.
The statements are constructed relatively simply.
It seems to be a kind of change report where columns from different tables are retrieved.
At the beginning there is a big CASE expression that generates a meaningful name for each value, followed by the values themselves; in one statement I counted 7,400 CASE branches.
In addition we have a number of big IN-lists.
All of this runs against a UNION view of 55 tables.
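To give an idea of the shape, here is a simplified, made-up skeleton (the table, view and column names are invented, not the real ones):

    SELECT
      CASE t.status_code
        WHEN 1 THEN 'Created'
        WHEN 2 THEN 'Approved'
        WHEN 3 THEN 'Rejected'
        -- ... roughly 7,400 WHEN branches ...
      END AS status_name,
      t.status_code,
      t.change_date
    FROM change_report_v t        -- UNION view over 55 tables
    WHERE t.object_id IN (1001, 1002, 1003 /* ... very long IN-list ... */)
      AND t.change_date > SYSDATE - 30;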
In other words: if I wanted to stress the parser, I would construct a statement exactly like this.
However, one hour still seems an unrealistically long parse time.
The statement clearly needs to be rewritten, but that will take time.
In the meantime I want to know whether there is any quick fix, such as increasing the shared pool a lot (which I unfortunately cannot test any time soon due to lack of memory).
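For reference, what I have in mind is something along these lines (the target size is just an example, and it assumes the SGA has room to grow):

    ALTER SYSTEM SET shared_pool_size = 30G SCOPE = BOTH;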
In a 10046 trace I see some waits on 'SGA: allocation forcing component growth', but no recursive statement that takes significant time. The 10053 trace does not help much, since the time seems to be spent in actual parsing. The transformation step FPD (filter push-down) takes some time, but it accounts for only about 20% of the total.
I also tried "_fix_control"='16923858', but did not find any timer entries in the trace.
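For completeness, this is roughly how I enabled the traces and the fix control (the fix-control level is the part I am unsure about; it may need a 'bugno:level' value rather than just the bug number):

    ALTER SESSION SET EVENTS '10046 trace name context forever, level 8';
    ALTER SESSION SET EVENTS '10053 trace name context forever, level 1';
    ALTER SESSION SET "_fix_control" = '16923858:1';   -- level 1 is a guess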
Any ideas how to speed up the parse time?
Database version is 19.7, and the shared pool size is 20 GB.
Thanks
Lothar