After reading the description in "Lost Messages", I came to understand what is actually happening in the publish/subscribe of messages, but there are still messages that never get enqueued in the hub. I have to manually stop and start the DB Adapter for this. Helpful pointers are really welcome :)
I assume your triggers on the publishing app (APP1) are calling both of the procedures created from iStudio: the first to create the message and the second to publish the message to the hub?
You can also try stopping the adapter, deleting the persistence directory, then starting the adapter and pushing the metadata again. This should clear the cache.
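For reference, that cache-clearing sequence might look something like the sketch below. The install path, adapter name, and stop/start script names are assumptions for a typical OracleAS InterConnect layout, so adjust them to your own environment:

```shell
# Sketch only: paths and script names are assumptions for a typical
# OracleAS InterConnect install -- adjust to your environment.

# 1. Stop the DB Adapter
$ORACLE_HOME/integration/interconnect/adapters/myDbAdapter/stop

# 2. Delete the cached messages in the persistence directory
rm -rf $ORACLE_HOME/integration/interconnect/adapters/myDbAdapter/persistence/*

# 3. Restart the adapter
$ORACLE_HOME/integration/interconnect/adapters/myDbAdapter/start

# 4. Re-push the metadata from iStudio
```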
Hope this helps
Thanks Craig for your help with the DB Adapter. I was able to get through!
I am now exploring the FTP Adapter for an integration scenario with a file-based legacy system. Do you have any helpful pointers on this?
Again, clearing the cache by removing the persistence directory and restarting the FTP Adapter applies here too.
If you are trying to process multiple data records, then make sure that your DTD is defined to allow multiple occurrences of the element in question. (The hub can only process one message at a time, therefore you need to work in arrays of data.) Also ensure your common view and receiving application view are defined as arrays (you will need to write some PL/SQL to loop through each element of the array at the receiving end).
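As an illustration, a DTD that allows multiple occurrences of the element in question might look like this. The element names here are invented for the example; what matters is the `*` occurrence indicator on the repeating element:

```dtd
<!-- Hypothetical DTD: ORDER_BATCH wraps repeating ORDER elements.
     The "*" (zero or more) is what lets one message carry
     multiple data records. -->
<!ELEMENT ORDER_BATCH (ORDER*)>
<!ELEMENT ORDER (ORDER_ID, CUSTOMER, AMOUNT)>
<!ELEMENT ORDER_ID (#PCDATA)>
<!ELEMENT CUSTOMER (#PCDATA)>
<!ELEMENT AMOUNT (#PCDATA)>
```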
Hope this helps
Thanks Craig for your nice advice.
My scenario is as follows:
Two systems (one on Oracle and the other file-based) have to be in sync. Whenever a record (or bulk records) are created on either system, the data must be propagated with little or no delay to the other system's corresponding table (or file).
Kindly provide your advice! I am really waiting for valuable feedback.
Hi Mustafa Jahangir,
I need some help... Could you please give me a detailed description of how I can communicate between two databases?
I have gone through the training material and done some modeling in iStudio, but I don't know how I can practically check that. If possible, please send me a practical scenario with a detailed description of how to integrate two databases.
I cannot speak to your file-based system, as I do not know how your messages on that side are created (published) or consumed (subscribed).
However, I am working on a site where we are integrating 12 systems with an Oracle database. These systems are a mixture of Oracle, Access and file-based systems.
In one example we are sending an XML file containing 17,000 elements from an FTP Adapter through to an Oracle DB Adapter. This passed from the file system through to an Oracle database in 4 seconds! (That was with full logging on.)
The things we have done in order to ensure a high throughput are:
(a) switched adapter logging down to "errors only" (i.e. in Adapter.ini, 'agent_log_level=0');
(b) in the FTP Adapter's ini file, reduced the polling interval from the default (60000 milliseconds [60 secs]) to a lower number, say 10000 [10 secs].
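Put together, the relevant adapter.ini entries might look like the fragment below. The `agent_log_level` name comes straight from the thread; the polling parameter name is an assumption, so check your own adapter.ini for the setting your adapter version actually uses:

```ini
; From the thread: errors-only logging
agent_log_level=0

; Polling interval in milliseconds (default 60000 = 60 secs,
; lowered here to 10000 = 10 secs). NOTE: the exact parameter
; name below is an assumption -- verify it against the FTP
; polling setting in your own adapter.ini.
file_polling_interval=10000
```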
In terms of application-side message creation, you can obviously use triggers (with care) to call your message creation scripts, or change Forms functionality using custom.pll to kick the process off. I actually favour the use of scheduled concurrent programs to manage message creation. This may work particularly well since, as I can see, you are doing bulk records: here you can "sweep up" and group (by using arrays) your records into one bulk transaction.
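For the trigger route, the shape is roughly as in the sketch below. The table, column, and procedure names are invented stand-ins; in practice the two calls would be the create-message and publish-message procedures that iStudio generates for your application view:

```sql
-- Hypothetical trigger on the publishing app's table. The two
-- procedure names stand in for the create-message and
-- publish-message procedures generated by iStudio.
CREATE OR REPLACE TRIGGER orders_publish_trg
AFTER INSERT ON orders
FOR EACH ROW
DECLARE
  v_msg_id NUMBER;
BEGIN
  -- 1. Create the message (iStudio-generated procedure)
  create_order_message(:NEW.order_id, :NEW.customer, :NEW.amount, v_msg_id);
  -- 2. Publish it to the hub (iStudio-generated procedure)
  publish_order_message(v_msg_id);
END;
/
```

Note the earlier caveat about using triggers "with care": a per-row trigger publishes one message per record, which is why a scheduled concurrent program that sweeps records into one bulk array transaction can scale better.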
For Oracle-side message consumption when using a DB Adapter, the way you write the "iStudio Generated Code" will obviously determine what happens to the data coming into Oracle.
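At the receiving end, the PL/SQL loop over the array mentioned earlier is typically something like this sketch. All names here are invented; the real type and procedure signatures come from what iStudio generates for your common view and receiving application view:

```sql
-- Hypothetical consumer-side loop: iterate over the array of
-- records in the inbound message and insert each one.
-- "inbound_orders" stands in for the PL/SQL table type that
-- iStudio generates for the receiving application view.
PROCEDURE consume_orders(p_orders IN inbound_orders) IS
BEGIN
  FOR i IN 1 .. p_orders.COUNT LOOP
    INSERT INTO orders_local (order_id, customer, amount)
    VALUES (p_orders(i).order_id,
            p_orders(i).customer,
            p_orders(i).amount);
  END LOOP;
  COMMIT;
END consume_orders;
```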
I hope this helps
As a quick check, when you start your DB Adapters, does the oailog.txt file show any connection errors to either of the target databases, or to the hub?
The only way you can really verify it is to actually run a test.
Send me a mail directly to email@example.com if you need a simple example.