We currently receive incident extracts from ServiceNow by email as .xlsx files covering all incidents in our departments, and we manually upload these extracts into Oracle Analytics datasets. These datasets then drive visualizations of key department metrics in an Oracle Analytics DV 7.6 workbook. We now plan to automate the data ingestion by using the REST API connection available in Oracle Analytics DV 7.6 to load data directly from ServiceNow into an Oracle Analytics DV dataset.
However, we face a challenge: the data volume is too large to ingest in a single API call. Our proposed solution is to perform a one-time historical load of inactive incidents and then load only incremental data going forward, meaning new incidents plus any updates to existing incidents.
While our dataflows can easily identify new rows of data (i.e., new incidents), the difficulty lies in tracking and loading the data that has changed since the historical load. Specifically, we need a way to identify and capture modifications made to incidents that were already ingested in the historical load, so that our dataset stays up to date with all changes.
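For context on how the changed rows could be requested at the source: ServiceNow's Table API supports server-side filtering via `sysparm_query`, and the standard `sys_updated_on` audit field can serve as the change watermark. Below is a minimal, hedged sketch of that pattern in Python; the instance name, auth header, page size, and field list are placeholder assumptions to verify against your own instance, not a definitive implementation.

```python
import json
import urllib.parse
import urllib.request


def build_incremental_params(last_load: str) -> dict:
    """Build Table API query parameters that return only incidents
    changed since last_load, using sys_updated_on as the watermark."""
    return {
        # Encoded query: rows touched on/after the watermark, in change order
        "sysparm_query": f"sys_updated_on>={last_load}^ORDERBYsys_updated_on",
        "sysparm_limit": "1000",  # page size; check your instance's limits
        # Illustrative field list only; request whatever your dataset needs
        "sysparm_fields": "number,state,sys_updated_on",
    }


def fetch_changed_incidents(instance: str, auth_header: str, last_load: str):
    """Page through all incidents changed since last_load.
    `instance` and `auth_header` are placeholders for your environment."""
    base = f"https://{instance}.service-now.com/api/now/table/incident"
    offset = 0
    while True:
        params = build_incremental_params(last_load)
        params["sysparm_offset"] = str(offset)  # simple offset pagination
        url = base + "?" + urllib.parse.urlencode(params)
        req = urllib.request.Request(
            url,
            headers={"Authorization": auth_header, "Accept": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            rows = json.load(resp)["result"]
        if not rows:
            break
        yield from rows
        offset += len(rows)
```

The same filtered, paged URL could in principle back the DV REST connection itself, with the watermark updated after each successful load.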
Is there a way, through dataflows, to replace existing incident rows with their updated versions whenever a status changes, or would a different approach be better for managing this incremental and changed-data load?
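For what it's worth, the replace-on-change behavior described above amounts to an upsert keyed on the incident number, keeping the row with the latest `sys_updated_on`. A dataflow can approximate this with a union plus a latest-row filter; here is a small Python sketch of just the merge logic, with field names assumed from the standard incident table:

```python
def upsert_incidents(existing: list[dict], changes: list[dict]) -> list[dict]:
    """Merge changed rows into the dataset, keyed by incident number.
    For each number, keep the row with the most recent sys_updated_on
    (string comparison works for ServiceNow's YYYY-MM-DD HH:MM:SS format)."""
    merged = {row["number"]: row for row in existing}
    for row in changes:
        current = merged.get(row["number"])
        if current is None or row["sys_updated_on"] > current["sys_updated_on"]:
            merged[row["number"]] = row  # new incident, or newer version
    return list(merged.values())
```

The key design point is that updates overwrite rather than append, so closed or re-opened incidents appear exactly once in the dataset.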