We usually run Update Statistics at night through a UNIX shell script for all critical tables, in full detail including indexes.
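A nightly job like that can be sketched with DBMS_STATS; the schema owner and table name below are placeholders for your own critical tables:

```sql
-- Sketch of a nightly stats call for one critical table.
-- CASCADE => TRUE also gathers index statistics ("full detail including indexes").
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SYSADM',   -- typical PeopleSoft schema owner; adjust to yours
    tabname          => 'PS_JOB',   -- example critical table
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    cascade          => TRUE);      -- include indexes
END;
/
```

A shell script would simply loop over the critical table list and run a block like this through sqlplus.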
If you are inserting or deleting voluminous data in transaction tables during the daytime, run stats through Application Engine or SQR. Use LOW mode, since PeopleSoft suggests it.
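Inside an Application Engine SQL step (or an SQR), the PeopleSoft meta-SQL for this is %UpdateStats; the record name below is a placeholder:

```sql
-- App Engine SQL step: refresh optimizer stats on a volatile transaction table.
-- LOW gathers estimated (sampled) statistics rather than a full compute.
%UpdateStats(VOUCHER_TAO, LOW)
```

PeopleTools resolves the record name to the underlying table and issues the platform-appropriate stats command at runtime.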
Periodically the DBA should run a process to search for tables that require reorganization.
Periodically review the query statistics (PSQRYSTATS) to find out which queries take the most time, and analyze the tables those queries reference.
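One way to review it is a simple query against PSQRYSTATS; the column names here are from memory and may vary by PeopleTools release, so verify them in your environment:

```sql
-- Top 20 queries by average execution time; check column names in your release.
SELECT *
  FROM (SELECT oprid, qryname, execcount, avgexectime
          FROM psqrystats
         ORDER BY avgexectime DESC)
 WHERE ROWNUM <= 20;
```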
If you're on Oracle 10g or 11g, you can take a look at
pscbo_stats - Improving Statistics in Oracle RDBMS for PeopleSoft Enterprise [ID 1322888.1]
on Oracle Support. It includes a download with a stored procedure and documentation. Even if you decide not to use it, it should give you a good idea of what Oracle considers to be best practice for PeopleSoft databases.
We have implemented the pscbo_stats procedure for a PeopleSoft HRMS 9.1 Oracle 11g database with good results. The only issue we have faced is the bind variable peeking issue, which may require you to set the undocumented Oracle database parameter _optim_peek_user_binds = FALSE.
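If you hit that issue, the workaround looks like this. Hidden (underscore) parameters should only be set on Oracle Support's advice, and the name must be quoted because of the leading underscore:

```sql
-- Disable bind variable peeking (undocumented parameter; set only if advised).
ALTER SYSTEM SET "_optim_peek_user_binds" = FALSE SCOPE = SPFILE;
-- Takes effect after a restart; check whether SCOPE = BOTH is allowed
-- for this parameter in your release before relying on it.
```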
I was the one who designed the pscbo_stats package to leverage the statistics collection techniques used in EBS and Siebel. (Mr. Sierra was the wizard who coded it.) I just noticed this thread and wanted to comment.
If you are using pscbo_stats, the gather_schema_stats() procedure will by default gather only stale statistics, so it can be run regularly; I suggest weekly. You can follow the contents of the pscbo_log table for historical data on all (non-dynamic) stats-gathering activities.
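A weekly schedule could be set up with DBMS_SCHEDULER. The pscbo_stats call below assumes a no-argument invocation; check the package spec and pass the schema name if your version requires it:

```sql
-- Hypothetical weekly job wrapping pscbo_stats.gather_schema_stats.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'PSCBO_WEEKLY_STATS',               -- name is a placeholder
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN pscbo_stats.gather_schema_stats; END;',
    repeat_interval => 'FREQ=WEEKLY; BYDAY=SUN; BYHOUR=2', -- Sundays at 02:00
    enabled         => TRUE);
END;
/
```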
If you find there is a table that is constantly stale, making you think the schema statistics should be gathered more often, you may want to have two schedules: one weekly for the schema, and one more frequent (daily?) to capture the more volatile objects, excepting, of course, those that are already dynamically sampled.
There may be objects that are volatile enough that they should be added to the Stage Table Exception table (see the stage_table_ins() procedure). We added that feature to configure dynamic sampling to work on volatile tables - tables that are not working storage for COBOL and App Engine but are so often stale that they should be dynamically sampled.
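Registering such a table might look like the call below; the parameter list for stage_table_ins() is an assumption on my part, so verify it against the delivered package spec before using it:

```sql
-- Hypothetical call: flag a volatile table for dynamic sampling
-- instead of stored statistics (table name is a placeholder).
EXEC pscbo_stats.stage_table_ins('PS_MY_VOLATILE_TBL');
```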
I find that there are a few transactional tables that work well when dynamically sampled. For example, PSTREESELECT* tables can be problematic if they are dynamically rebuilt by an nVision report book - the end points can get stale quickly. They may be good candidates for dynamic sampling.
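Outside the package, the generic Oracle technique for pushing a table onto dynamic sampling is to delete its statistics and then lock them, so the optimizer samples at hard parse instead of using stored stats (owner and table name below are placeholders):

```sql
-- Force dynamic sampling on one PSTREESELECT table.
BEGIN
  DBMS_STATS.DELETE_TABLE_STATS(ownname => 'SYSADM', tabname => 'PSTREESELECT10');
  DBMS_STATS.LOCK_TABLE_STATS  (ownname => 'SYSADM', tabname => 'PSTREESELECT10');
  -- With no stored stats and stats locked, optimizer_dynamic_sampling (>= 2)
  -- causes the optimizer to sample the table at hard parse time.
END;
/
```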
Lastly, I am very interested to hear about user experiences with the pscbo_stats package. Please post your comments to the communities.oracle.com forum for "Install/Upgrade PSFT".