I created a script in dev on Oracle 10g R2 using dbms_scheduler, and it ran successfully; I was able to verify the logs in all_scheduler_job_run_details in dev. However, when I moved the script to the QA environment and executed it, the following happened: a chain step in the script that updates the request to WIP executed successfully, and I was able to verify the status from the table. However, I did not find any logs in all_scheduler_job_run_details in QA, nor did I find any details in all_scheduler_programs, all_scheduler_chains, or all_scheduler_chain_steps. Is it possible that someone, such as a DBA, deliberately turned logging off in QA, or is there some other scenario in which this might have occurred? I am also executing an external job, and it runs fine in dev; in QA, however, the external job is not executed. I confirmed that the CREATE EXTERNAL JOB privilege is granted to the schema in QA.
For the external job, is it created successfully, without errors? Does the run_count in dba_scheduler_jobs get incremented when it is supposed to run?
If run_count is incremented, then the job is running but not getting logged. To check why it is not getting logged:
- make sure the SYSAUX tablespace is online and has space available
- check the logging_level for the job and for the job_class that it points to
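The two checks above can be sketched as the following queries (run as a user with access to the DBA views; the owner MYSCHEMA and job name MY_JOB are placeholders for your own):

```sql
-- Is SYSAUX online, and how much free space does it have?
SELECT tablespace_name, status
FROM   dba_tablespaces
WHERE  tablespace_name = 'SYSAUX';

SELECT SUM(bytes)/1024/1024 AS free_mb
FROM   dba_free_space
WHERE  tablespace_name = 'SYSAUX';

-- Logging level of the job itself and of its job class
SELECT owner, job_name, logging_level, job_class
FROM   dba_scheduler_jobs
WHERE  owner = 'MYSCHEMA' AND job_name = 'MY_JOB';

SELECT job_class_name, logging_level
FROM   dba_scheduler_job_classes;
```

If the job's logging_level is OFF and its class does not force a higher level, no rows will appear in the run-details views.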
One thing to do would be to create a simple one-time PL/SQL block job to see whether that runs and gets logged.
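A minimal version of that test might look like this (the job name TEST_LOG_JOB is just a placeholder):

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'TEST_LOG_JOB',
    job_type   => 'PLSQL_BLOCK',
    job_action => 'BEGIN NULL; END;',
    enabled    => TRUE,           -- run as soon as it is created
    auto_drop  => FALSE);         -- keep the job so it stays visible afterwards
END;
/

-- After a few seconds, check whether the run was logged:
SELECT job_name, status, error#, additional_info
FROM   all_scheduler_job_run_details
WHERE  job_name = 'TEST_LOG_JOB';
```

If this job runs but produces no row in all_scheduler_job_run_details either, the problem is with logging in general rather than with your chain or external job.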
Hope this helps,
In dev, the external job ran successfully and the essmsh errors (related to Hyperion Essbase) appeared in the additional_info column; I captured those errors from all_scheduler_job_run_details. In QA, there are no logs in any of the dbms_scheduler views, including all_scheduler_jobs, so I think there is no run_count either. I don't have access to dba_scheduler_jobs in QA, and I assume anything in dba_scheduler_jobs should also appear in all_scheduler_jobs. If there were any privilege issues, I would have expected errors while executing the package, but there are none: the PL/SQL package executes successfully.
Should I check the logging level by calling dbms_scheduler.get_attribute with name => job_name and attribute => 'logging_level'?
That's really weird. If a user successfully executes dbms_scheduler.create_job then there should always be a row in all_scheduler_jobs (except for a run-immediate job that drops itself immediately).
You could definitely try dbms_scheduler.get_attribute with attribute => 'logging_level', but this should fail if there are no rows in all_scheduler_jobs, since no rows in all_scheduler_jobs means that no jobs are accessible to the current user.
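A sketch of that call is below. I'm assuming the VARCHAR2 overload of GET_ATTRIBUTE handles logging_level here, and TEST_JOB is a placeholder name; simply querying the logging_level column of all_scheduler_jobs avoids the overload question entirely.

```sql
DECLARE
  l_level VARCHAR2(30);
BEGIN
  DBMS_SCHEDULER.GET_ATTRIBUTE(
    name      => 'TEST_JOB',        -- placeholder job name
    attribute => 'logging_level',
    value     => l_level);
  DBMS_OUTPUT.PUT_LINE('logging_level = ' || l_level);
END;
/
```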
I think you might be better off figuring out why all_scheduler_jobs is empty, since it definitely should not be. Querying dba_scheduler_jobs might help, since that view is guaranteed to list all jobs (all_scheduler_jobs only lists jobs that are accessible to the current user).
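Comparing the two views side by side makes the gap obvious (the second query requires access to the DBA view, e.g. via a DBA):

```sql
-- Jobs visible to the current user
SELECT owner, job_name, enabled, state
FROM   all_scheduler_jobs;

-- Every job in the database, regardless of who can see it
SELECT owner, job_name, enabled, state
FROM   dba_scheduler_jobs;
```

Any job that appears in the second result set but not the first is one the current user simply has no privileges on.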
I have set auto_drop to TRUE for the job. Will that make any difference? I will check all_scheduler_job_run_details in the respective schema, and also dba_scheduler_job_run_details if possible.
I found out the issue. I misinterpreted the purpose of all_scheduler_job_run_details and the other dbms_scheduler views such as all_scheduler_jobs. I thought ALL referred to all jobs created by all schemas, so I was checking for the programs, chains, chain steps, and job entries from a different schema and hence did not find any entries. In fact, all_scheduler_job_run_details pertains only to jobs visible to the particular schema doing the querying.
Glad you figured it out. Just to clarify:
all_scheduler_jobs is a list of all jobs that are visible to the current user. If the user has the CREATE ANY JOB system privilege then the view will list all jobs. Otherwise it will list jobs that are owned by the current user and jobs that the current user has the ALTER privilege on.
Other all_scheduler_* views are similar in that they show information for objects that are visible to the current user.
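So, for example, the job owner can make a job visible to another user by granting the ALTER object privilege on it (MYSCHEMA, my_job, and other_user are placeholders):

```sql
-- As MYSCHEMA, the job owner:
GRANT ALTER ON my_job TO other_user;

-- OTHER_USER should now see MYSCHEMA.MY_JOB here:
SELECT owner, job_name
FROM   all_scheduler_jobs;
```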