OAS DV Data Set and Data Flow Schedule Details

Hi Team,
Could you confirm whether the OAS BIPLATFORM repository database stores schedule and run-status details for DV Data Sets and Data Flows?
I’m specifically looking for tables that show:
- Which schedules exist
- Planned/actual start & end times
- Success/failed status and any error messages
If available, please point me to the exact table names and relationships (for both schedules and historical runs), and any known differences across OAS versions.
Thanks in advance,
Dilek
Answers
-
Hi,
I haven't checked these details directly because I haven't had a need so far. Assuming your RCU schemas are on an Oracle database, you could enable audit logging for those schemas (just BIPLATFORM should be enough; the other schemas are for other things) and you will be able to see what happens when you schedule something and when you execute something.
That's the easiest way to find the answer, because I'm quite sure this isn't something documented. A number of tables in BIPLATFORM have proper foreign keys defined, which should give you the relationships. But looking at the audit logs of your database will give you all the details of every query generated by the product.
That's how I found a number of undocumented answers to my questions in the past.
It's something your DBA should easily be able to help you with.
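If it helps, here is a minimal sketch of that setup, assuming an Oracle 12c+ database with unified auditing and a DEV_BIPLATFORM schema (adjust for your RCU prefix). The policy name, user names and connection details are placeholders, and the DDL needs a user with the AUDIT_ADMIN role (or hand it to your DBA):

```python
# Hedged sketch: capture everything the BIPLATFORM user executes via a
# unified audit policy. Schema name, policy name and connection details are
# placeholders; run this as a user with the AUDIT_ADMIN role.
import oracledb

conn = oracledb.connect(user="audit_admin_user", password="***",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()

# Audit every action, but scope it to the BIPLATFORM user only when enabling it.
cur.execute("CREATE AUDIT POLICY biplatform_activity ACTIONS ALL")
cur.execute("AUDIT POLICY biplatform_activity BY DEV_BIPLATFORM")
```

The captured statements then show up in the UNIFIED_AUDIT_TRAIL view.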
0 -
Hi Dilek,
To Gianni's point, if you are only querying, you should be OK.
If you try to manually manipulate the data in the tables, you may break things and lose your ability to get Support assistance.
Maybe you could provide some insight into what your goal is, why you need the information, and why the UI/API/CLI options are not adequate.
0 -
I think it would be great if you could get this out of the box. I appreciate that Usage Tracking in OAS does capture some DV-type objects (where "Dim - Logical SQL"."Query Type" = Visual Analyzer), but it's limited.
We were able to query classic Agent runs and also BIP schedules, but we had to query BIPLATFORM.CSS_FILE_CONTENT and BIPLATFORM.CSS_SI_FILES and work out what useful information in there we could use.
Having one place to query all OAS processes would be nice from an admin point of view, rather than multiple places in the product, though I appreciate that self-service by the user is the direction of travel.
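For reference, a rough sketch of the kind of Usage Tracking query I mean, assuming the standard S_NQ_ACCT physical table and that the "Query Type" presentation column maps to QUERY_SRC_CD. The schema name, connection details and the exact 'Visual Analyzer' value are assumptions to verify on your instance:

```python
# Hedged sketch: pull DV/Visual Analyzer queries from Usage Tracking.
# Assumes the standard S_NQ_ACCT table; schema name, connection details and
# the exact QUERY_SRC_CD value are assumptions -- check what your instance records.
import oracledb

conn = oracledb.connect(user="usage_tracking_reader", password="***",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()
cur.execute("""
    SELECT user_name, start_ts, end_ts, success_flg, total_time_sec, query_src_cd
      FROM dev_biplatform.s_nq_acct
     WHERE query_src_cd = 'Visual Analyzer'   -- verify the exact value first
     ORDER BY start_ts DESC
""")
for row in cur.fetchmany(20):
    print(row)
```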
2 -
Tim, sounds like it would make a fantastic Idea Lab request (enhancement), if there is not one already.
1 -
Dilek, it seems to be quite "messy", with a lot of places where things happen…
For example, when I reload a dataset based on a query, what seems to give me some useful information is looking in W_ETL_RUN_OBJ using the object_id column, and from there taking the run_wid column value to query W_ETL_LOG with obj_wid equal to the previous run_wid; there the textfield column gives me all the details of the reload.
There is also W_ETL_RUN_SOBJ, which seems to be related as well. It looks like the same thing, and I have no clue what the difference is (welcome to BIPLATFORM, the place where you shouldn't be…).
W_ETL_LOG is a "fun" one because the textfield is updated all the time during the execution of a process, appending the new log text to the existing value. It therefore works better for checking finished jobs, so you don't have a moving value every time you query it.
There definitely must be a better way to get just the info you are looking for, because that's what shows on screen when you inspect your dataset. But at the same time there seem to be so many tables involved in everything that it is maybe not really straightforward. Looking for an API is maybe a safer option, as it will do the whole job and collect all you need in a single request.
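To make that concrete, a rough sketch of the lookup described above. These are undocumented BIPLATFORM tables and the column names come straight from the observations in this thread, so treat it as read-only exploration and verify on your own version; the schema name, connection details and dataset object_id are placeholders:

```python
# Hedged sketch of the lookup described above: join W_ETL_RUN_OBJ to W_ETL_LOG
# on obj_wid = run_wid and read the textfield log. Undocumented tables --
# SELECT only, never modify anything in BIPLATFORM.
import oracledb

oracledb.defaults.fetch_lobs = False  # return the (potentially large) log text as a plain string

conn = oracledb.connect(user="readonly_user", password="***",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()

dataset_object_id = "my_dataset_object_id"  # placeholder: the dataset's object_id

cur.execute("""
    SELECT r.object_id, r.run_wid, l.textfield
      FROM dev_biplatform.w_etl_run_obj r
      JOIN dev_biplatform.w_etl_log     l ON l.obj_wid = r.run_wid
     WHERE r.object_id = :obj_id
""", obj_id=dataset_object_id)

for object_id, run_wid, log_text in cur:
    print(object_id, run_wid)
    print(log_text)
```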
As said, if you want to dig deeper, just create an audit policy for all the actions, enable it only for the BIPLATFORM user, and then filter things when querying the audit logs.
With some observation of the patterns, and after excluding all the noise you will capture (there are a lot of scheduled processes running in the background all the time, making a lot of noise), you will identify the logic of what is happening.
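A rough sketch of that filtering step, under the same assumptions as the audit-policy sketch earlier in the thread; the user names, time window and the NOT LIKE exclusion pattern are placeholders, to be replaced with whatever background noise you actually observe:

```python
# Hedged sketch: read back the unified audit trail for the BIPLATFORM user,
# keeping a recent time window and excluding some placeholder background noise.
# The querying user needs the AUDIT_VIEWER role (or similar) to see the trail.
import oracledb

conn = oracledb.connect(user="audit_viewer", password="***",
                        dsn="dbhost:1521/ORCLPDB1")
cur = conn.cursor()
cur.execute("""
    SELECT event_timestamp, action_name, object_name, sql_text
      FROM unified_audit_trail
     WHERE dbusername = 'DEV_BIPLATFORM'
       AND event_timestamp > SYSTIMESTAMP - INTERVAL '1' HOUR
       AND (object_name IS NULL OR object_name NOT LIKE 'QRTZ%')  -- placeholder noise filter
     ORDER BY event_timestamp
""")
for row in cur.fetchall():
    print(row)
```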
0 -
Talking about APIs…
/dv/ui/api/v1/dataset/schedules?id=<dataset id> ⇒ should give you the schedule details of a dataset
/dv/ui/api/v1/dataload/jobs/dataset?objectId=<dataset id> ⇒ gives you the history of the loads with all the details
If you check your browser developer tools when looking at this information in the web interface, you will see these Fetch/XHR calls executed by the webpage. There you have all the APIs you need.
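In case it saves someone a trip to the developer tools, a rough sketch of calling those two endpoints outside the browser. Note these are the internal XHR calls the DV UI makes rather than a documented public REST API, so the base URL, the basic-auth credentials, the dataset id format and TLS handling shown here are all assumptions; you may need to reuse your browser session cookies instead:

```python
# Hedged sketch: call the two internal DV endpoints mentioned above.
# Base URL, credentials and dataset id are placeholders -- copy the id value
# from the XHR call you see in your browser's developer tools.
import requests

BASE = "https://oas-host:9502"       # placeholder OAS URL
AUTH = ("weblogic", "***")           # placeholder credentials
dataset_id = "dataset_id_from_the_ui"

schedules = requests.get(
    f"{BASE}/dv/ui/api/v1/dataset/schedules",
    params={"id": dataset_id}, auth=AUTH, verify=False)  # verify=False only for internal testing
print(schedules.json())

history = requests.get(
    f"{BASE}/dv/ui/api/v1/dataload/jobs/dataset",
    params={"objectId": dataset_id}, auth=AUTH, verify=False)
print(history.json())
```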
0