
    analyze and stats and DBA actions

    796925
      Hi

      one of my customers had issues when our code called the graph.analyze() function, and they wanted the overhead related to this call to occur under DBA control.

      Their DBAs already have Oracle doing some statistics and related maintenance work overnight as part of their regular DBA activity (not motivated by the spatial package).
      Looking at older threads on this forum, there seem to be operations like sem_perf.gather_stats that the DBA may need to add explicitly to the work that gets done overnight.

      The older threads on this topic seem to offer advice to the developer of a semantic solution, rather than to the DBA responsible for maintaining a database that, amongst other things, hosts a semantic solution. I need to offer advice to the DBA working for one of my customers, and I am unclear how to reframe the documented procedures for that audience.

      thanks

      Jeremy
        • 1. Re: analyze and stats and DBA actions
          alwu-Oracle
          Hi Jeremy,

          graph.analyze can take a few arguments (including estimatePercent and dop) to reduce the time needed for statistics collection. We recommend running graph.analyze only after a significant amount of data has been modified in, or inserted into, a particular graph.
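
          For a DBA who prefers to drive this from SQL*Plus rather than from the application's Java code, something along the lines of the sketch below may be an option. SEM_APIS.ANALYZE_MODEL and its estimate_percent/degree parameters are an assumption on my part rather than something confirmed in this thread, so check the SEM_APIS reference for your release; the model name is illustrative.

          BEGIN
            -- Gather statistics for one semantic model, sampling 10% of the data
            -- and using a degree of parallelism of 8 (both values are examples).
            SEM_APIS.ANALYZE_MODEL(
              model_name       => 'FAMILY',
              estimate_percent => 10,
              degree           => 8);
          END;
          /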

          In your case, if the overhead is still significant (after using dop for example), then the DBA can schedule sem_perf.gather_stats periodically. Note that sem_perf.gather_stats can be run in parallel as well.

          For example, sem_perf.gather_stats(false, 8) will gather statistics for all graphs in the semantic network with a degree of parallelism of 8.
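
          To put this under regular DBA job control, one option is a scheduler job. A minimal sketch is below, assuming the scheduling user can execute SEM_PERF (for example via an EXECUTE grant); the job name and timing are illustrative.

          BEGIN
            DBMS_SCHEDULER.CREATE_JOB(
              job_name        => 'SEM_NETWORK_STATS_JOB',                  -- illustrative name
              job_type        => 'PLSQL_BLOCK',
              job_action      => 'BEGIN sem_perf.gather_stats(false, 8); END;',
              start_date      => SYSTIMESTAMP,
              repeat_interval => 'FREQ=DAILY; BYHOUR=2',                   -- nightly at 02:00
              enabled         => TRUE,
              comments        => 'Nightly statistics gathering for the semantic network');
          END;
          /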

          Alternatively, the DBA can schedule dbms_stats.gather_schema_stats() against the MDSYS schema with the options parameter set to 'GATHER AUTO'. With GATHER AUTO, Oracle implicitly determines which objects need new statistics and how to gather them.
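
          A minimal sketch of that schema-level call (degree and cascade are left at their defaults here):

          BEGIN
            -- Let Oracle decide which MDSYS objects need fresh statistics
            DBMS_STATS.GATHER_SCHEMA_STATS(
              ownname => 'MDSYS',
              options => 'GATHER AUTO');
          END;
          /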

          Your DBA may also want to save the "good" statistics (using DBMS_STATS.EXPORT_SCHEMA_STATS or EXPORT_TABLE_STATS) so that, if the new statistics cause surprises, the old statistics can be imported back.
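
          A minimal sketch of that save-and-restore cycle (the staging table name is illustrative, and the table only needs to be created once):

          -- One-time setup: a staging table to hold exported statistics
          EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'MDSYS', stattab => 'SEM_STATS_BACKUP');

          -- Save the current "good" MDSYS statistics before regathering
          EXEC DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'MDSYS', stattab => 'SEM_STATS_BACKUP');

          -- If the new statistics cause plan regressions, bring the saved set back
          EXEC DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'MDSYS', stattab => 'SEM_STATS_BACKUP');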

          Hope it helps,

          Zhe Wu