
    Interconnect and batched data

    440061
      I'm new to Interconnect, and need some guidance.

      We are creating Interconnect links between all of our systems. Some will be 'real-time' - i.e. as changes happen in one system, they trigger Interconnect messages which subscribed systems pick up and process. And that works well.

      However, some of our feeds are batches. For example, each night one system expects a file containing all the changes that have happened during the day.
      I'm not sure what the best way to achieve this is. I can see that I should use the FTP/File adapter, but I'm not sure of the best way to generate the batch.

      Is there any way to 'stockpile' messages and then, at a given time, combine them into a single message and publish it?

      Or should I have the triggers write changes to a temporary table, and have a scheduled job generate a message with an array of changed records at a given time?

      Or should I be using invoke events somehow?

      Any help much appreciated.

      Robin
        • 1. Re: Interconnect and batched data
          416900
          Robin,

          There are many ways to trap changes in the database - triggers, form customisations, user hooks, etc. Whichever way you choose to find all the new, changed or deleted data, you could simply put it in a staging table, then publish messages from that table periodically.
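          For illustration, a minimal staging pattern might look something like this - the table, column and trigger names are just made-up examples, not anything Interconnect requires:

          e.g.
          -- Hypothetical staging table that stockpiles changes until the nightly run
          CREATE TABLE xx_order_changes_stg (
            order_id       NUMBER       NOT NULL,
            change_type    VARCHAR2(1)  NOT NULL,             -- 'I', 'U' or 'D'
            change_date    DATE         DEFAULT SYSDATE NOT NULL,
            processed_flag VARCHAR2(1)  DEFAULT 'N' NOT NULL  -- set to 'Y' once published
          );

          -- Hypothetical trigger that records each change as it happens
          CREATE OR REPLACE TRIGGER xx_orders_chg_trg
          AFTER INSERT OR UPDATE OR DELETE ON orders
          FOR EACH ROW
          BEGIN
            INSERT INTO xx_order_changes_stg (order_id, change_type)
            VALUES (NVL(:NEW.order_id, :OLD.order_id),
                    CASE WHEN INSERTING THEN 'I'
                         WHEN UPDATING  THEN 'U'
                         ELSE 'D'
                    END);
          END;
          /

          The scheduled job then only has to read the unprocessed rows, build the message, and flag them as processed.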

          Do you have Oracle Applications at your Source end?
          You could use Concurrent Programs (or Request Sets) to manage and schedule the generation of messages at the end of the day.

          I have used Concurrent Programs, and Request Sets, to call "Extract" and "Publish" PL/SQL procedures. These procedures get all the data that has changed over the last 24 hours, and from that you can create "array" type messages for batch processing.
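          If you go this route, the entry point is just a PL/SQL procedure with the standard concurrent program errbuf/retcode parameters. A rough skeleton - all the names here are only examples:

          e.g.
          CREATE OR REPLACE PROCEDURE xx_extract_and_publish (
            errbuf  OUT VARCHAR2,  -- standard concurrent program out parameters
            retcode OUT VARCHAR2)
          IS
            l_from_date DATE;
            l_to_date   DATE;
          BEGIN
            -- 1. Work out the FROM/TO date window (see the queries below)
            -- 2. Select everything that changed inside that window
            -- 3. Build and publish the "array" message
            retcode := '0';        -- 0 = Success, 1 = Warning, 2 = Error
          EXCEPTION
            WHEN OTHERS THEN
              errbuf  := SQLERRM;
              retcode := '2';
          END xx_extract_and_publish;
          /

          Register this as a PL/SQL Stored Procedure executable, attach it to a concurrent program, and the Concurrent Manager takes care of the nightly scheduling for you.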

          To find out the last time my "Extract" concurrent program ran successfully, I examine the actual_start_date in the view fnd_conc_req_summary_v.

          e.g.
          SELECT NVL(MAX(actual_start_date), SYSDATE)
          FROM fnd_conc_req_summary_v
          WHERE user_concurrent_program_name = '&p_conc_program_name'
          AND status_code = 'C'
          AND request_id != &p_request_id;

          I take this returned date and use it as my "FROM DATE".

          My "TO DATE" is likewise set by examining the actual_start_date in the view fnd_conc_req_summary_v, this time for the current request.
          e.g.
          SELECT actual_start_date
          FROM fnd_conc_req_summary_v
          WHERE request_id = &p_request_id;

          When I run my "Extract" programs, I use these "FROM" and "TO" dates against the source tables' last_update_date column to identify the changes and build the batch file.
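          So inside the procedure the extract itself boils down to something like this - again, the source table and columns are just placeholders:

          e.g.
          SELECT ord.order_id,
                 ord.status,
                 ord.last_update_date
          FROM orders ord
          WHERE ord.last_update_date > l_from_date
          AND ord.last_update_date <= l_to_date;

          Using > on the "FROM" date and <= on the "TO" date means consecutive runs never miss, or double-count, a row that falls exactly on a window boundary.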

          Unfortunately, you can only "stockpile" the raw data. Once a message has been created, it is difficult to change it or to add other messages to it.

          As an aside, don't think that you have to use the FTP adapter just because you want to create "array" / batch messages. You can use the DB Adapter too (8.1.7 and above).

          Simply design the publish side of the message in iStudio as a tiered "array" type, and iStudio will generate all the basic code for you.

          Have a look at this thread for more information...
          Re: Publishing master -detail info from database(DBadap) , ending in xml file

          HTH
          Yan