Comments
-
We don't do anything special for BIP report extractions; they run the exact same way. All the hash values are computed on the local database side, not in the cloud. We literally have all of the metadata for a report or a PVO in a table and simply generate tables and stored procedures to handle the merges (along with some index…
-
I don't know about a use case; it's just how we handle importing the extracted PVO datasets into our local copy. To elaborate: let's pick a relatively simple PVO, FscmTopModelAM.FinExtractAM.GlBiccExtractAM.BalanceExtractPVO. We import that data every 2 hours by grabbing the csv/pecsv files off the UCM. For each csv file, we…
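For context, the staging load for each downloaded csv can be done with a plain BULK INSERT. This is only a sketch under assumed names; stg.BalanceExtractPVO and the file path are hypothetical, not the poster's actual objects:

```sql
-- Hypothetical staging load for one BICC csv already downloaded from UCM.
BULK INSERT stg.BalanceExtractPVO
FROM 'C:\bicc\downloads\file_fin_balances-batch0001.csv'
WITH (
    FORMAT   = 'CSV',    -- SQL Server 2017+: honors quoted fields
    FIRSTROW = 2,        -- skip the header row
    CODEPAGE = '65001',  -- BICC extracts are UTF-8
    TABLOCK              -- allow minimally logged bulk load
);
```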
-
What we do is the following: import the extract to a staging table and calculate a hash (SHA-256) there. That hash, along with the primary keys (things change if the keys can be null, of course), is used in a T-SQL MERGE statement in our primary warehouse, so we are only updating/touching rows that actually have a changed…
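A minimal sketch of that hash-and-merge step, assuming hypothetical table and column names (PERIOD_NAME and CODE_COMBINATION_ID stand in for the PVO's real primary keys, and ROW_HASH is a varbinary(32) column):

```sql
-- Compute the SHA-256 row hash in staging (hypothetical columns).
UPDATE stg.BalanceExtractPVO
SET ROW_HASH = HASHBYTES('SHA2_256',
        CONCAT_WS('|', PERIOD_NAME, CODE_COMBINATION_ID,
                  BEGIN_BALANCE_DR, BEGIN_BALANCE_CR));

-- Merge into the warehouse, touching only rows whose hash changed.
MERGE dw.GlBalances AS tgt
USING stg.BalanceExtractPVO AS src
   ON tgt.PERIOD_NAME         = src.PERIOD_NAME
  AND tgt.CODE_COMBINATION_ID = src.CODE_COMBINATION_ID
WHEN MATCHED AND tgt.ROW_HASH <> src.ROW_HASH THEN
    UPDATE SET tgt.BEGIN_BALANCE_DR = src.BEGIN_BALANCE_DR,
               tgt.BEGIN_BALANCE_CR = src.BEGIN_BALANCE_CR,
               tgt.ROW_HASH         = src.ROW_HASH
WHEN NOT MATCHED BY TARGET THEN
    INSERT (PERIOD_NAME, CODE_COMBINATION_ID,
            BEGIN_BALANCE_DR, BEGIN_BALANCE_CR, ROW_HASH)
    VALUES (src.PERIOD_NAME, src.CODE_COMBINATION_ID,
            src.BEGIN_BALANCE_DR, src.BEGIN_BALANCE_CR, src.ROW_HASH);
```

Nullable keys would need ISNULL wrapping or a different join predicate, which is presumably the "things change if the keys can be null" caveat.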
-
Got mine yesterday.
-
It's posted in the first message of this thread. Make sure you look on page 1.
-
I got a notice that the voting on this idea is open. Considering it was already open, I'm not sure what that means, but I'm hopeful once more.
-
I don't think 24D information is available yet.
-
I haven't seen or heard anything on this idea.
-
@Bret Grinslade - Oracle Analytics-Oracle I would take this in any format I can get. To me, it would be a highly useful feature. Do I now need to take this idea, submitted nearly a year ago, and put it into a new area to have it even looked at?
-
Hi there, to be 100% honest with you, I don't really remember. In general, we have pivoted over to regenerating the reports out of PVOs where this is an issue. However, one trick to try on the data model is to make sure the Include Parameter Tags option is enabled. That should force the headers to generate.
-
@Pietro Papaioannu I unfortunately still do not have a solution to this, though I will look at the doc you linked.
-
Just on the off chance that someone, somewhere will come across this and go "yes, that's what I need too": here is what I eventually came up with.

    WHERE LAST_UPDATE_DATE <
        CASE WHEN extract(hour from current_timestamp) >= 12
             THEN (TRUNC(Current_Date) + interval '12' hour)
             ELSE TRUNC(Current_Date)
        END

It caps the filter at today's noon once the afternoon starts, and at today's midnight before that. Your mileage may vary.
-
If you get a pecsv data set (which is just the primary keys identified for your PVO), that is the current valid set of active records for that PVO. We handle it in this manner: we pull data rows (delta or full depending on the PVO/day/situation, usually just delta changes). These rows are bulk inserted into staging tables…
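Since the pecsv is the full set of currently active keys, one way to use it (a sketch with hypothetical names, not the poster's actual code) is to soft-delete warehouse rows whose keys no longer appear in it:

```sql
-- stg.BalanceExtractPVO_Keys holds the pecsv key rows (hypothetical).
UPDATE tgt
SET    tgt.IS_ACTIVE = 0   -- soft delete: key vanished from Fusion
FROM   dw.GlBalances AS tgt
WHERE  NOT EXISTS (
           SELECT 1
           FROM   stg.BalanceExtractPVO_Keys AS k
           WHERE  k.PERIOD_NAME         = tgt.PERIOD_NAME
             AND  k.CODE_COMBINATION_ID = tgt.CODE_COMBINATION_ID
       );
```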
-
This is long needed. The BICC interface is fine; however, it is not the ideal way to deal with things, imo.
-
@Marcelo Finkielsztein Thanks for the comment. I'm having someone take a look; for the most part I'm just the data transport person, and I don't actually know what the RPD client admin tool is. It's just something I get asked about often.
-
Try R13.x Deprecated and New BI View Objects — Cloud Customer Connect (oracle.com)
-
The inability to alter an existing job with regard to a customized PVO seems to be a severe limitation on the usefulness of this API functionality. If I want to customize a base PVO, I put it in a job. If I want to schedule an extract of a PVO, it goes into a job. However, if I want to use the REST API to alter the filter…
-
This would have saved us a number of hours spent re-customizing the PVOs after we split the jobs up.
-
It would allow better programmatic access for handling jobs in BICC. While BICC is good, it's not ideal for large job setups.
-
BIP has limits as well, but I don't know the tool you referenced. One major limit of BIP reports is a hard 2-million-row maximum that Oracle seems unwilling to lift.