Execution time threshold for auto-reoptimization


The auto-reoptimization feature, based on cardinality feedback, tries to find a better execution plan when execution reveals that the cardinality estimations were far from the observed values. This is generally a good thing, but there are cases where a plan optimized on bad estimations is actually faster. An example here: https://blog.dbi-services.com/when-automatic-reoptimization-plan-is-less-efficient.
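
For context, here is one way to observe this feedback loop at work (Oracle 12c and later): after the first execution of a statement whose estimations were off, the cursor is flagged for re-optimization in V$SQL. A minimal sketch, assuming you already know the statement's SQL_ID:

    SELECT sql_id, child_number, is_reoptimizable,
           ROUND(elapsed_time / NULLIF(executions, 0) / 1000) AS avg_ms
    FROM   v$sql
    WHERE  sql_id = '&sql_id';
    -- IS_REOPTIMIZABLE = 'Y' means the next parse may produce a new plan,
    -- even if the current average elapsed time is perfectly acceptable.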

The goal of the optimizer here is to always try to find a faster plan. But in real life, this is not what users want. They want stability and an execution time below a defined performance criterion. For example, for an interactive screen, I accept the response time as long as it is not longer than 200 milliseconds. I don't care that the optimizer could find a plan that executes in 50 milliseconds. But I'll really be upset if it comes up with one that takes 1 second.


So, my suggestion is to have a hint that defines the performance criterion threshold, such as /*+ TARGET_TIME_MILLISECONDS(200) */. When the execution time is lower than this threshold, the optimizer will not try to re-optimize the statement, whatever the difference between estimated and actual cardinalities is. When the execution time is longer than this threshold, all the new optimizer features can try to find a better plan. Going further, logging all queries that run longer than the threshold would help tuning: for example, always include them in Statspack/AWR snapshots, mark them for the automatic tuning tasks, etc.
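
To make the proposal concrete, a hypothetical usage would look like this (TARGET_TIME_MILLISECONDS is the hint I am proposing, not an existing one; the table and bind names are made up for illustration):

    SELECT /*+ TARGET_TIME_MILLISECONDS(200) */ order_id, status
    FROM   orders
    WHERE  customer_id = :cust_id;
    -- Executions under 200 ms: keep the current plan, ignore cardinality misestimates.
    -- Executions over 200 ms: allow re-optimization and log the statement for tuning.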


IMO, this would make it possible to keep adding new 'intelligent' optimizer features while limiting regressions on the critical queries where the performance criterion has been clearly defined.
