Often people write to ask why some database tool such as SQL Developer, DBArtisan or Toad, when working against Oracle 10g, 11g, or 12c databases, seems generally sluggish, or why some screens appear to lock up or take an extremely long time to refresh. They are often especially frustrated because this problem did not occur with Oracle versions prior to 10g. So what is the database tool doing wrong (i.e. is it a bug)?
The short answer is that this is an Oracle DBA best-practices issue, not a database tool problem per se. The database tool just happens to expose the symptoms because it makes such frequent access to data dictionary views and tables. The underlying problem generally affects all database applications to varying degrees; it is simply exposed most prominently by the database tool.
Oracle versions prior to 10g offered two query optimizers: cost-based and rule-based, with rule-based being the default and fallback, and, quite importantly, used internally for most access to the data dictionary. While 10g and beyond still offer both optimizers, all access to the data dictionary now goes through the cost-based optimizer, hence the data dictionary needs statistics gathered in order to function efficiently.
The common misconception (and source of the problem) is that people think Oracle 10g does this "auto-magically" as a background task (i.e. GATHER_STATS_JOB). However, the database does not – or at least not reliably. As a result, in well over 90 percent of cases where database tool users experience sluggish performance against an Oracle 10g, 11g, or 12c database, this is the root cause. Most times it takes a "heated debate" between the database tool users, their DBAs, and myself to convince them to try the quick and very simple fix – which takes just two minutes to perform and then successfully test.
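A quick way to see whether the dictionary has ever had statistics gathered is to check LAST_ANALYZED on a few core SYS-owned tables. This is a sketch using DBA_TAB_STATISTICS (available from 10g onward); the specific table names are just representative dictionary base tables:

```sql
-- Check when (if ever) statistics were last gathered on some core
-- dictionary base tables owned by SYS. A NULL means never analyzed.
SELECT table_name, last_analyzed
  FROM dba_tab_statistics
 WHERE owner = 'SYS'
   AND table_name IN ('OBJ$', 'TAB$', 'COL$', 'IND$')
 ORDER BY table_name;
```

If LAST_ANALYZED comes back NULL (or very old) across the board, the symptoms described above are almost certainly statistics-related.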
Here’s the passage from the Oracle 10g documentation that usually convinces them to try the fix:
Oracle® Database Performance Tuning Guide
10g Release 2 (10.2)
Chapter: 14 Managing Optimizer Statistics
- When to Use Manual Statistics
Automatic statistics gathering should be sufficient for most database objects which are being modified at a moderate speed. However, there are cases where automatic statistics gathering may not be adequate…
Another area in which statistics need to be manually gathered are the system statistics. These statistics are not automatically gathered. See "System Statistics" for more information.
Statistics on fixed objects, such as the dynamic performance tables, need to be manually collected using GATHER_FIXED_OBJECTS_STATS procedure.
So here’s the recommended fix (performed via a DBA privileged account):
- exec dbms_stats.gather_dictionary_stats;
- exec dbms_stats.gather_fixed_objects_stats;
These two commands should probably be run about once per week for best results. Of the two commands, the first is far more critical as it handles the “ALL/DBA/USER_*” views and V$ tables – which are accessed constantly from within the database tool.
The second command is for the X$ tables, which are primarily referenced only by some advanced DBA features.
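To verify whether fixed-object statistics have ever been gathered, one hedged check (assuming a release where DBA_TAB_STATISTICS exposes OBJECT_TYPE = 'FIXED TABLE') is:

```sql
-- Count X$ fixed tables that carry statistics; a result of zero
-- suggests GATHER_FIXED_OBJECTS_STATS has never been run.
SELECT COUNT(*) AS analyzed_fixed_tables
  FROM dba_tab_statistics
 WHERE object_type = 'FIXED TABLE'
   AND last_analyzed IS NOT NULL;
```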
Please consider getting your DBA either to schedule these calls as a recurring PL/SQL job or to implement them via an "on instance start-up" database-level trigger.
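As a sketch of the scheduled-job approach using DBMS_SCHEDULER (the job name and schedule below are illustrative; adjust them to your site's standards), a weekly job might look like:

```sql
BEGIN
  -- Hypothetical job name; runs both statistics calls every Sunday at 02:00.
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'WEEKLY_DICTIONARY_STATS',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN
                          DBMS_STATS.GATHER_DICTIONARY_STATS;
                          DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
                        END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=WEEKLY; BYDAY=SUN; BYHOUR=2',
    enabled         => TRUE,
    comments        => 'Weekly dictionary and fixed-object statistics');
END;
/
```

The advantage of a scheduled job over a start-up trigger is that statistics stay reasonably fresh even on instances that are rarely restarted.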
Now, to give you an example of how much difference this process can make: we've clocked screens in the database tool that run quickly on pre-10g releases but take a very long time on a default install of 10g (sometimes as long as 45 minutes). Once the proper data dictionary statistics are collected, the database tool's performance equals or exceeds the pre-10g timings.
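You can reproduce a before-and-after comparison yourself in SQL*Plus; the query below is just an illustrative dictionary-heavy example, similar in spirit to what GUI tools issue constantly:

```sql
SET TIMING ON
-- A dictionary-intensive query: time it once before gathering
-- statistics, then again after running the two fix commands.
SELECT owner, object_type, COUNT(*)
  FROM dba_objects
 GROUP BY owner, object_type;
```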