
    Effect of Gathering Statistics on Performance........

    user9085814
      Hello All,

      When do schema/table statistics need to be gathered, and why?

    Should we pick business hours or non-business hours to gather them?
        • 1. Re: Effect of Gathering Statistics on Performance........
          993727
          It's useful for explain plans/execution paths, since the optimizer relies on statistics to choose plans. Gather them at off-peak times.
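
          If you rely on the default maintenance-window job for the off-peak gathering, you can check when those windows actually open (a sketch; window names such as MONDAY_WINDOW are the 11g defaults and vary by version):

            -- List the scheduler windows that drive the automated stats job
            -- so you can confirm they fall outside business hours
            SELECT window_name, repeat_interval, duration, enabled
            FROM   dba_scheduler_windows
            ORDER  BY window_name;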
          • 2. Re: Effect of Gathering Statistics on Performance........
            Rob_J
            Hi,

            There are many different approaches to gathering stats for schemas and tables (see http://www.ora00600.com/scripts/statistics/gathering_statistics.html). My thoughts are:

            1. You want to ensure stability on your PROD DB, so fully test any stats changes before rolling them out
            2. Gather using a 100% sample rate for full accuracy on a DEV environment, export those stats for testing, and import them into PROD when signed off (sketched below)
            3. Gather new stats when you think you need to, such as when large data volumes change, columns change, etc.

            Just remember that there is no 100% right way to do this. Everyone will have their own opinion based on what they have seen and used before. What I would say is: make sure you test any change. If on 11g you could use SQL Plan Management to minimise risk when changing the stats.
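
            A rough sketch of point 2 (the APP schema and STATS_TRANSFER stat table are made-up placeholders):

              -- On DEV: gather with a full sample, then export into a stat table
              BEGIN
                DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'APP', estimate_percent => 100, cascade => TRUE);
                DBMS_STATS.CREATE_STAT_TABLE(ownname => 'APP', stattab => 'STATS_TRANSFER');
                DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'APP', stattab => 'STATS_TRANSFER');
              END;
              /
              -- Move APP.STATS_TRANSFER to PROD (e.g. via Data Pump), then on PROD:
              BEGIN
                DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'APP', stattab => 'STATS_TRANSFER');
              END;
              /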

            Rob
            • 3. Re: Effect of Gathering Statistics on Performance........
              Dom Brooks
              You want to ensure stability on your PROD DB, so fully test any stats changes before rolling them out
              Not gathering stats does not necessarily equal stability :)
              If on 11g you could use SQL Plan Management to minimise risk when changing the stats.
              One of the factors here is to have a nice, long AWR retention.
              Then you have a nice big window to get previous plans, at least for your biggest-hitting SQL, and then use baselines or SQL profiles via the coe_xfr_sql_profile script to lock in the previous plan.
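
              For the baseline route, something along these lines (a sketch only: the SQL tuning set name, snapshot range and sql_id are placeholders; coe_xfr_sql_profile is the SQL-profile alternative):

                -- Pull the old plan out of AWR into a SQL tuning set,
                -- then load it as a SQL plan baseline
                DECLARE
                  cur DBMS_SQLTUNE.SQLSET_CURSOR;
                  cnt PLS_INTEGER;
                BEGIN
                  DBMS_SQLTUNE.CREATE_SQLSET(sqlset_name => 'OLD_PLAN_STS');
                  OPEN cur FOR
                    SELECT VALUE(p)
                    FROM   TABLE(DBMS_SQLTUNE.SELECT_WORKLOAD_REPOSITORY(
                                   begin_snap   => 1000,
                                   end_snap     => 1010,
                                   basic_filter => 'sql_id = ''abcd1234efgh5''')) p;
                  DBMS_SQLTUNE.LOAD_SQLSET(sqlset_name => 'OLD_PLAN_STS', populate_cursor => cur);
                  cnt := DBMS_SPM.LOAD_PLANS_FROM_SQLSET(sqlset_name => 'OLD_PLAN_STS');
                END;
                /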
              • 4. Re: Effect of Gathering Statistics on Performance........
                Rob_J
                Dom Brooks wrote:
                You want to ensure stability on your PROD DB, so fully test any stats changes before rolling them out
                Not gathering stats does not necessarily equal stability :)
                Yes, you're right. However, changing stats directly on a production environment, in my opinion, is asking for trouble. I've seen it happen too many times where a stats gather job runs one day and the next morning there are plan changes causing issues. I don't like getting woken up too early so I prefer to make sure no stats changes are going to PROD without testing :)
                • 5. Re: Effect of Gathering Statistics on Performance........
                  Dom Brooks
                  So if you set the stats job to run at lunchtime, once you've had your sandwich, you're in the perfect place to start fixing all these issues. No? :)
                  • 6. Re: Effect of Gathering Statistics on Performance........
                    marksmithusa
                    With SQL Plan Management, I found that it can really screw up your database performance if you have queries which don't make use of bind variables.

                    I found this out when I enabled it for our Production Exadata machine: most of the queries are well-coded, boxed queries which make use of bind variables perfectly.

                    However, there was one application (a .NET one) which simply refused to use them. Every time it executed (and it executed a LOT), it would use dynamic statements with literal values. This caused the Plan Management to BALLOON from something reasonable to 600 billion rows. At the start, no-one noticed and people liked the plan stability, but then the whole database started to crawl because every statement referenced the HUGE table.

                    Needless to say, the coders of the application wouldn't budge from their insistence on literal values because THEIR code was dynamically generated from a 'core' which had no understanding of them. I was very disappointed when I had to turn it off as it's a great feature. Bad coding beats great features anytime, though.
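
                    If you suspect the same kind of bloat, one rough way to check it, and to stop automatic capture, is something like this (the parsing-schema breakdown is just one way of spotting a literal-SQL offender):

                      -- Count baselines per parsing schema to spot literal-SQL bloat
                      SELECT parsing_schema_name, COUNT(*) AS baselines
                      FROM   dba_sql_plan_baselines
                      GROUP  BY parsing_schema_name
                      ORDER  BY baselines DESC;

                      -- Stop new plans being captured automatically
                      ALTER SYSTEM SET optimizer_capture_sql_plan_baselines = FALSE;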

                    As for gathering stats, I've found that the automated job which runs in the maintenance window does a fairly good job. We haven't changed the estimate_percent defaults, and sometimes we have to compute statistics on a couple of awkward tables (i.e. estimate_percent at 100) manually. But it's pretty solid.
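
                    For those awkward tables, the manual gather is roughly this (the owner and table names are made up):

                      -- Compute (100% sample) stats on one problem table
                      BEGIN
                        DBMS_STATS.GATHER_TABLE_STATS(
                          ownname          => 'APP',
                          tabname          => 'AWKWARD_TABLE',
                          estimate_percent => 100,
                          cascade          => TRUE);
                      END;
                      /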

                    If you want to know which tables have 'stale' stats, you can query dba_tab_statistics and look at the stale_stats column.
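
                    Something like this (a sketch):

                      -- Tables the auto job currently considers stale
                      SELECT owner, table_name, last_analyzed, stale_stats
                      FROM   dba_tab_statistics
                      WHERE  stale_stats = 'YES'
                      ORDER  BY owner, table_name;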

                    Mark