
I’ll be speaking at the annual IOUG Collaborate Conference in Las Vegas on April 3rd and 5th. To quote their website:


The Independent Oracle Users Group (IOUG), the Oracle Applications Users Group (OAUG) and Quest International Users Group (Quest) present COLLABORATE 17: Technology and Applications Forum for the Oracle Community. As an educational conference, COLLABORATE 17 helps users of the full family of Oracle business applications and database software gain greater value from their Oracle investments.


Created by and for customers, COLLABORATE 17 offers an expert blend of customer-to-customer interaction and insights from technology visionaries and Oracle strategists. Expand your network of contacts by interacting with Oracle customers, solutions providers, consultants, developers and representatives from Oracle Corporation at COLLABORATE 17.


Here are my presentation topics, dates and times:


  • Oracle Utilities as Vital as Ever
    • Apr 3, 2017
    • 12:00 PM–12:30 PM
    • Palm D
    • Session ID: 762
    • Why are books like "Advanced Oracle Utilities: The Definitive Reference" imperative for today's DBAs and senior developers? The answer: Oracle has become so robust and complex that many new features are implemented not as SQL commands, but as PL/SQL packages and/or stand-alone utilities. Thus, to best leverage all that Oracle has to offer, one must know the non-SQL command interfaces for those new features. This presentation will demonstrate a few critical examples of such features and their usage. The goal is to inspire attendees to realize that SQL alone is no longer sufficient – that there are many other commands and PL/SQL packages to master.


  • Productivity Enhancing SQL*Plus Scripting Techniques
    • Apr 5, 2017
    • 12:00 PM–12:30 PM
    • Palm D
    • Session ID: 759
    • No matter what other Oracle or third-party database tools Oracle DBAs and developers might embrace, SQL*Plus remains a mainstay. However, many don't realize its full efficiency potential because they have not mastered advanced techniques such as dynamic SQL scripting (i.e. scripts that generate and run scripts). Oracle ACE Bert Scalzo will demonstrate such techniques and provide example scripts to increase any SQL*Plus user's productivity. Furthermore, some useful, advanced Unix/Linux shell scripting techniques will be highlighted as well.


  • Flash Based Methods to Speed Up DW & BI Oracle Databases
    • Apr 5, 2017
    • 11:00 AM–12:00 PM
    • Jasmine D
    • Session ID: 761
    • DW & BI databases are growing ever bigger, while analytical users continue demanding fast performance. While Exadata may be a great solution, not everyone can afford such a capital investment. Oracle ACE Bert Scalzo will highlight simple yet effective methods to utilize flash-based storage technology to significantly improve performance with the least cost, effort and interruption. In some cases it's possible to get near-Exadata performance without as many zeros added to the cost.
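
To give a flavor of the dynamic SQL scripting technique mentioned in the second abstract – a script that generates and then runs another script – here's a minimal SQL*Plus sketch. The recompile-invalid-objects task and the spool file name are just illustrations, and it assumes a session with access to DBA_OBJECTS:

```sql
-- Generate a script that recompiles invalid objects, then run it
set echo off feedback off heading off pagesize 0 trimspool on verify off
spool compile_invalid.sql
SELECT 'ALTER ' || object_type || ' ' || owner || '.' || object_name || ' COMPILE;'
  FROM dba_objects
 WHERE status = 'INVALID'
   AND object_type IN ('PROCEDURE', 'FUNCTION', 'PACKAGE', 'VIEW');
spool off
-- Now execute the script we just generated
@compile_invalid.sql
```

The same generate-then-run pattern applies to almost any repetitive DDL or maintenance task.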

I’ll be speaking at the annual Hotsos Symposium in Dallas, TX, February 27th and 28th. To quote their website:


Hotsos is the best conference in the Americas devoted to Oracle system performance. The combination of topic focus, small audience size, and large speaker list make the Hotsos Conference an unsurpassed networking opportunity.

The Hotsos Symposium reaches for a new technical high with presentations focused on techniques, experimental results, and field-tested scripts that attendees can take home and apply to real-world tasks. The speakers routinely offer an excellent balance between database principles and problem-solving techniques. The speakers also show you how to determine the difference between reliable information and bad advice around Oracle performance optimization.


Here are my presentation topics, dates and times:


  • Optimizing Data Warehouse AdHoc Queries against Star Schemas
    • Monday, February 27, 2017
    • 1:00pm to 2:00pm CST
    • Hall B & Virtual Hall B
    • Attendees will learn optimal techniques for designing, monitoring and tuning "Star Schema" Data Warehouses in Oracle 11g and 12c. While there are numerous books and papers on Data Warehousing with Oracle, they generally provide a 50,000-foot overview focusing on hardware and software architectures – with some database design. This presentation provides the ground-level, detailed recipe for successfully querying tables whose sizes exceed a billion rows. Issues covered will include table and index designs, partitioning options, statistics and histograms, Oracle initialization parameters and star transformation explain plans. Attendees should be DBAs familiar with "Star Schema" database designs who have experience with Oracle 11g and some exposure to Oracle 12c.


  • Successful Dimensional Modeling of Very Large Data Warehouses
    • Tuesday, February 28, 2017
    • 1:00pm to 2:00pm CST
    • Hall B & Virtual Hall B
    • Attendees will learn successful techniques for dimensional modeling of very large Data Warehouses using traditional Entity Relationship diagramming tools. While there are numerous new modeling conventions and tools, Entity Relationship modeling tools have proven best at real-world database design and implementation. This presentation provides the ground-level, detailed recipe for optimal dimensional modeling of tables whose sizes exceed 500 million rows. Issues covered will include "Star Schemas", fact and dimension tables, plus aggregate table design and population. Attendees should be DBAs or senior developers familiar with Oracle database design and any form of data modeling.

Often people write to ask why some database tool such as SQL Developer, DBArtisan or Toad seems generally sluggish when working against Oracle 10g, 11g, or 12c databases, or why some screens appear to lock up or take nearly as long to refresh as their refresh cycle. They are often especially frustrated since this problem did not occur with Oracle versions prior to 10g. So what is the database tool doing that’s wrong (i.e. is it a bug)?


The short answer is that this is an Oracle DBA best-practices issue – and not a database tool problem per se. The database tool just happens to expose the symptoms, since it makes such frequent access to data dictionary views and tables. The underlying problem generally affects all database applications to varying degrees – it’s just exposed most prominently via the database tool.


Oracle versions prior to 10g offered two query optimizers: cost-based and rule-based – with rule-based being the default and fallback, and, quite importantly, used internally for most access to the data dictionary. While 10g and beyond still offer both optimizers, all access to the data dictionary is via the cost-based optimizer; hence the data dictionary needs statistics gathered in order to function efficiently.


The common misconception (and source of the problem) is that people think Oracle 10g does this “auto-magically” as a background task (i.e. GATHER_STATS_JOB). However, the database does not – well, not reliably anyway. As a result, in 90+ percent of the cases where database tool users experience sluggish performance against an Oracle 10g, 11g, or 12c database, this is the root cause. Most times it takes a “heated debate” between the database tool users, their DBAs, and myself to convince them to try the quick and very simple fix – which takes just two minutes to perform and then successfully test.


Here’s the Oracle 10g documentation which most times finally gets them to try our fix:

Oracle® Database Performance Tuning Guide
10g Release 2 (10.2)

Chapter: 14 Managing Optimizer Statistics

  1. When to Use Manual Statistics

Automatic statistics gathering should be sufficient for most database objects which are being modified at a moderate speed. However, there are cases where automatic statistics gathering may not be adequate…

Another area in which statistics need to be manually gathered are the system statistics. These statistics are not automatically gathered. See "System Statistics" for more information.

Statistics on fixed objects, such as the dynamic performance tables, need to be manually collected using GATHER_FIXED_OBJECTS_STATS procedure.


So here’s the recommended fix (performed via a DBA privileged account):

  • exec dbms_stats.gather_dictionary_stats;
  • exec dbms_stats.gather_fixed_objects_stats;


These two commands should probably be run about once per week for best results. Of the two commands, the first is far more critical, as it handles the data dictionary objects underlying the “ALL/DBA/USER_*” views – which are accessed constantly from within the database tool.

The second command covers the fixed objects (the X$ tables underlying the V$ views), which are primarily referenced only by some advanced DBA features.


Please consider having your DBA make these calls either a scheduled PL/SQL job or possibly an “on instance start-up” database-level trigger.
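
For example, here's one way to sketch such a weekly job via DBMS_SCHEDULER (the job name and schedule below are arbitrary choices, and a DBA privileged account is required):

```sql
-- Weekly job to refresh dictionary and fixed-object statistics
-- (job name and schedule are arbitrary; run from a DBA account)
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'WEEKLY_DICT_STATS',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN
                          DBMS_STATS.GATHER_DICTIONARY_STATS;
                          DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
                        END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=WEEKLY; BYDAY=SUN; BYHOUR=2',
    enabled         => TRUE);
END;
/
```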


Now to give you an example of how much difference this process can make – we've clocked screens in the database tool that run quickly on pre-10g, but take a very long time on a default install of 10g (sometimes as long as 45 minutes). Once the proper data dictionary statistics are collected, the database tool's performance equals or exceeds the pre-10g timings.

For far too long, database administrators and developers have focused on explain plans for tuning SQL statements that perform poorly. But with Oracle 11g and 12c, explain plans can quite possibly become a thing of the past. Furthermore, 12c can perform the entire tuning process automatically – with minimal time spent during the initial SQL development and maybe 1/10th the effort spent addressing any problematic SQL that somehow does find its way into production.


Does this all sound too good to be true? Then read on …


Let’s start by defining what an explain plan is – at least conceptually. The explain plan is the “approximate algorithm” that the Oracle database engine will attempt to “execute” in order to perform the requested database operation. Since SQL is a non-procedural, set-oriented language, the explain plan is the only way to “guesstimate” what the SQL code will actually do, because the code itself only expresses what is to be done – not how. The explain plan exposes both the algorithm and the indexes (if any) that the database engine would utilize during actual execution.
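
To make that concrete, here's how one can request and display an explain plan from SQL*Plus without actually executing the statement (a sketch; the query assumes the classic SCOTT demo tables EMP and DEPT):

```sql
-- Ask Oracle for its "approximate algorithm" without running the SQL
EXPLAIN PLAN FOR
SELECT e.ename, d.dname
  FROM emp e
  JOIN dept d ON d.deptno = e.deptno;

-- Display the plan just written to the PLAN_TABLE
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```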


Over the years (and major database versions), Oracle has tried to make SQL optimization more stable and reliable than simple explain plans. An excerpt from the book Oracle 11g New Features delineates this progression nicely – which has been as follows:


  • Explain plans for Oracle versions <= 7.X
  • Plan stability for Oracle versions 8i and 9i
  • SQL Profiles for Oracle version 10g
  • SQL Plan Management for Oracle version 11g and 12c


To appreciate why this progression was necessary, an analogy helps: imagine planning a vacation road trip by visiting an auto club office (e.g. AAA) to have a route, or TripTik, prepared for me. That approach has some notable cons:


  • I have to wait in line for my turn to be helped
  • The office may not have current information
  • The person helping me might be misinformed
  • I could get a scenic route whereas the family simply wants to get there quickly
  • Things can change between getting the TripTik and starting/ending my trip
  • I must possess both an automobile and a valid driver's license to make the trip
  • I’m purposefully eliminating far better methods or mode choices (e.g. flying)


Believe it or not, the exact same cons can often exist for explain plans – as follows:


  • During the execution phase, the explain plan must first be derived
  • Step #1 requires some resources, so there could be a delay
  • The statistics and/or histograms could be out of date (i.e. stale)
  • The SQL may be so malformed that fast explain plan derivation could be suboptimal
  • The SQL code might contain hints that are suboptimal and yet take precedence
  • The fundamental performance conditions can change (e.g. new indexes, removal of indexes, partitioning of objects, partitioning of indexes, “init.ora” changes that might affect SQL execution, addition and/or subtraction of server resources, etc.)
  • Users must know how to obtain and read explain plans, and how to make SQL rewrites and/or database structure changes in order to improve performance


So let’s examine how the newer versions of Oracle have largely eliminated such concerns – in particular, Oracle 11g’s Automatic SQL Plan Management, which can often yield great results for far less (if not minimal) effort.


Oracle 10g adds the new SQL Tuning Advisor – a utility to “Analyze individual SQL statements, and recommend SQL profiles, statistics, indexes, and restructured SQL to improve overall SQL execution performance.” Using my vacation analogy, the SQL Advisor is more like going to the American Airlines web site and booking airfare for the same trip. I’ll let the airline, pilot, navigator, tower staff, and air traffic controllers worry about all the details – I just have to say get me from point A to B for whatever price and travel time I am willing to bear. Likewise, we can now manually inform Oracle to worry about the algorithmic details churning inside – then we’ll just let the database know how much effort (i.e. time) to expend in order to meet our general SQL performance needs.


Thus Oracle 10g requires the DBA to remain engaged to generate, review and implement the recommendations, as shown below in Figure 1 (a slide from the Oracle 11g new features training class). Returning to the vacation analogy via an airline, I still have to manually input the origin, destination, dates, seat class and number of stops that I’m willing to endure. I can then choose from among the flight recommendations, but until I press the “purchase the ticket” button, nothing has actually happened yet. It’s all just “what-if” analysis whose impact won’t be felt unless formally chosen.



Figure 1: Manual SQL Tuning in Oracle 10g


So for 10g, we’d look in OEM’s Advisor Central at the SQL Tuning Advisors – specifically the SQL Tuning Advisor – looking for any statements with excessive database time or IO. The AWR repository will automatically contain all the SQL tuning candidates, broken down by their relative percentage contribution to the cumulative total, as shown below in Figure 2.



Figure 2: Top SQL by IO Tuning Candidates


The SQL Tuning Advisor will offer suggestions to improve performance – as well as an estimate of the percentage improvement, as shown below in Figure 3. Note too that it suggests more than just alternate explain plans (essentially an internal SQL rewrite) – it also offers other critical advice, including database structural changes.



Figure 3: Tuning Advisor Recommendations
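

For those not using OEM, the same advisor can also be driven manually from SQL*Plus via the DBMS_SQLTUNE package. A rough sketch (the SQL ID and task name here are hypothetical, and the ADVISOR privilege is required):

```sql
-- Create, run, and report a tuning task for one problem statement
DECLARE
  l_task VARCHAR2(64);
BEGIN
  l_task := DBMS_SQLTUNE.CREATE_TUNING_TASK(
              sql_id     => 'b6usrg82hwsa3',   -- hypothetical SQL ID
              time_limit => 60,                -- seconds of analysis effort
              task_name  => 'demo_tuning_task');
  DBMS_SQLTUNE.EXECUTE_TUNING_TASK(task_name => 'demo_tuning_task');
END;
/

-- Review the advisor's findings and recommendations
SET LONG 100000 LONGCHUNKSIZE 100000 LINESIZE 200
SELECT DBMS_SQLTUNE.REPORT_TUNING_TASK('demo_tuning_task') FROM dual;
```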


Oracle 11g adds SQL Plan Management, which can essentially automate the 10g scenario in whole, as shown below in Figure 4 (another slide from the Oracle 11g new features training class). Returning once again to my vacation analogy, this would be like setting some American Airlines web site booking preferences – and automatically having the airline purchase those tickets that best meet all my needs (without any involvement from me). Yes – I’m giving up some control, but in the real world we call such scenarios excellent customer service. So why should the database and/or DBA offer anything less? And for those DBAs who don’t want to relinquish such control, or who worry about the relative overhead involved, the last step can be either manual or automatic.



Figure 4: Automatic SQL Tuning in Oracle 11g and 12c
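

The moving parts can also be touched directly from SQL*Plus via DBMS_SPM and an initialization parameter. A rough sketch (the SQL ID below is hypothetical; requires 11g or later and DBA privileges):

```sql
-- Automatically capture baselines for repeatable SQL statements
ALTER SYSTEM SET optimizer_capture_sql_plan_baselines = TRUE;

-- Or manually load the plan(s) for one statement from the cursor cache
DECLARE
  l_plans PLS_INTEGER;
BEGIN
  l_plans := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(sql_id => 'b6usrg82hwsa3');
END;
/

-- Review which baselines exist and whether they are accepted for use
SELECT sql_handle, plan_name, enabled, accepted
  FROM dba_sql_plan_baselines;
```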


Of course, in Oracle 12c this only gets better. Instead of preserving just repeatedly executed SQL statements as candidates, it now records and tests all SQL statements. I for one cannot wait to see what 12c R2 offers to improve this.