Oracle Transactional Business Intelligence
BI Publisher Report as Cloud Data Extract engine
Using a report as a Cloud Data Extract engine while keeping track of extracted records
Cloud Integration process where data is extracted and sent via FTP to other partner applications.
If this process runs on a schedule (or is triggered manually), it becomes difficult to determine which records were already extracted and which were not. Today, other mechanisms have to be used to prevent already-extracted records from being sent twice.
Could we perform the following steps:
#01: Give the BIP report the capability (similar to EBS) to update designated DFF fields (and only those fields) on the source tables being pulled, so that the next run can validate and exclude those records. The field should be stored as a datetime so it records when the data was pulled. In addition, if a record is updated after extraction, it should be pulled again by BIP; this can be determined by comparing the stored datetime against the record's LAST_UPDATE_DATE.
#02: Allow the ATOM configuration to accommodate custom BI Publisher reports, so that one can create reports and deliver them via the ATOM audit structure to FTP or email.
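The selection logic behind #01 can be sketched outside of BIP. This is a minimal illustration, not an Oracle API: the field and table names are hypothetical stand-ins, with `extracted_at` playing the role of the DFF datetime column the idea asks BIP to maintain, and `last_update_date` the standard audit column.

```python
from datetime import datetime

# Hypothetical in-memory stand-in for the source table. "extracted_at" models
# the DFF attribute the idea would let BIP update; "last_update_date" is the
# standard Fusion audit column.
records = [
    {"id": 1, "last_update_date": datetime(2023, 1, 1), "extracted_at": None},
    # Updated after its last extraction, so it should be pulled again:
    {"id": 2, "last_update_date": datetime(2023, 1, 5),
     "extracted_at": datetime(2023, 1, 3)},
    # Already extracted and unchanged since, so it should be skipped:
    {"id": 3, "last_update_date": datetime(2023, 1, 2),
     "extracted_at": datetime(2023, 1, 4)},
]

def select_for_extract(rows):
    """Pick rows never extracted, or updated since their last extraction."""
    return [r for r in rows
            if r["extracted_at"] is None
            or r["last_update_date"] > r["extracted_at"]]

def stamp_extracted(rows, run_time):
    """After a successful run, record the extraction time in the DFF field."""
    for r in rows:
        r["extracted_at"] = run_time

delta = select_for_extract(records)      # records 1 and 2 qualify; 3 does not
stamp_extracted(delta, datetime(2023, 1, 10))
```

In SQL terms this is simply a `WHERE extracted_at IS NULL OR last_update_date > extracted_at` filter, followed by an update of the stamped column once delivery succeeds.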
Did you ever figure out whether #01 was possible? Our company is looking for ways to extract small to large data sets using BI Publisher extracts (and the External Report Web Service) to fit our specific business requirements. The ability to update a DFF field when pulling data would make it much easier to incrementally pull only the data that hasn't been pulled yet.
I am also looking for the same solution. At one of our customers, we need to build a report that can fetch only the delta records since the previous run. It is made more difficult because LAST_UPDATE_DATE is not indexed. Can you please help with this?
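When the source table cannot be updated at all, a common workaround for the delta question above is a "high watermark": persist the maximum LAST_UPDATE_DATE seen in the previous run (in a control table or file) and filter the next run against it. A minimal sketch, with illustrative names only; note that without an index on LAST_UPDATE_DATE this filter still implies a full scan of the source table:

```python
from datetime import datetime

def delta_since(rows, watermark):
    """Return rows changed after the stored watermark, plus the new watermark.

    A watermark of None means no previous run, so everything qualifies.
    """
    changed = [r for r in rows
               if watermark is None or r["last_update_date"] > watermark]
    new_watermark = max((r["last_update_date"] for r in rows),
                        default=watermark)
    return changed, new_watermark

rows = [
    {"id": 10, "last_update_date": datetime(2023, 2, 1)},
    {"id": 11, "last_update_date": datetime(2023, 2, 3)},
]

first, wm = delta_since(rows, None)   # first run pulls everything
second, wm = delta_since(rows, wm)    # nothing changed since, so empty
```

In a BIP data model the same idea is a bind parameter (e.g. the last run date passed in by the scheduler) compared against LAST_UPDATE_DATE in the WHERE clause, which avoids touching the source table but, unlike the per-record DFF stamp in #01, cannot tell which individual rows were delivered successfully.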