From the info you gave, I can tell that choosing the tool is not your concern. Yes, Interconnect can do this.
Be sure you have decided whether the exchange is a message-based or a batch-based process. The first is a typical Interconnect solution; the batch-based one isn't, since sending lots of data at once can hurt the performance of the hub.
What you should focus on is the data itself.
First of all, choose the source application/database for the address data, and make sure you know the data structure in that DB.
Also define the events that should trigger the messages. Can they easily be recognized (a database trigger, a save, ...)?
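As a hypothetical sketch of that idea (this is illustrative application code, not Interconnect API calls, and the field names are made up), a save hook could compare old and new values and only emit an event when something relevant changed, so unchanged saves don't flood the hub:

```python
from typing import Optional

# Fields whose change should trigger a message (illustrative list).
WATCHED_FIELDS = ("street", "city", "postal_code")

def detect_address_event(old: dict, new: dict) -> Optional[dict]:
    """Return an event payload if any watched field changed, else None."""
    changes = {f: new[f] for f in WATCHED_FIELDS if old.get(f) != new.get(f)}
    if not changes:
        return None
    return {"event": "address.updated", "key": new["id"], "changes": changes}

old = {"id": 7, "street": "Main St 1", "city": "Oslo", "postal_code": "0150"}
new = {"id": 7, "street": "Main St 2", "city": "Oslo", "postal_code": "0150"}
event = detect_address_event(old, new)
```

Here only the street changed, so the event payload carries just that one field; a save with no watched changes produces no message at all.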
Then define your common data structure (model) for the information object as it should be exchanged between all applications. Keep it simple but as generic as possible. Think about unique keys!
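To make the idea concrete (a minimal sketch; the field names and key format are assumptions, not your actual schema), such a common model could look like this, with a key that stays unique across all participating systems:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical canonical address model for the exchange; field names
# are illustrative, not taken from any real source system.
@dataclass(frozen=True)
class AddressMessage:
    address_id: str   # globally unique key, e.g. "<source-system>:<local-pk>"
    street: str
    city: str
    postal_code: str
    country: str

    def to_json(self) -> str:
        # Serialize to the common wire format all applications agree on.
        return json.dumps(asdict(self))

msg = AddressMessage("CRM:1042", "Main St 1", "Oslo", "0150", "NO")
payload = msg.to_json()
```

Prefixing the local primary key with the source system name is one simple way to keep keys unique across databases that each number their own records independently.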
For each receiving application, make sure you know the data structure and how to process your data.
And last but not least, think about the consequences of inconsistencies! Maybe the data should be cleaned before you start exchanging it. Inconsistencies may not be a problem within one DB, but could create problems in others...
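As a hypothetical sketch of such a pre-exchange cleaning step (the normalization rules here are purely illustrative), records that look different only because of whitespace or casing can be normalized and flagged as duplicates before they ever reach the hub:

```python
def normalize(addr: dict) -> dict:
    # Illustrative normalization: trim whitespace, title-case the city,
    # uppercase the country code.
    return {
        "street": addr["street"].strip(),
        "city": addr["city"].strip().title(),
        "country": addr["country"].strip().upper(),
    }

def find_duplicates(addresses: list) -> list:
    """Return normalized records that occur more than once."""
    seen, dupes = set(), []
    for addr in map(normalize, addresses):
        key = tuple(sorted(addr.items()))
        if key in seen:
            dupes.append(addr)
        seen.add(key)
    return dupes

records = [
    {"street": "Main St 1 ", "city": "oslo", "country": "no"},
    {"street": "Main St 1", "city": "Oslo", "country": "NO"},
]
dupes = find_duplicates(records)
```

The two input records are harmless variants inside one database, but without normalization they would land as two different addresses in every receiving system.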
If you want more details let me know, maybe contact me by e-mail personally.
In our company we have used Interconnect for 2 years now.
Thanks for answering; that put a couple of things straight in my head :)
It seems, however, that this part of the integration project has been put on hold, since the differences in the way each institution uses the system create difficulties that require further research before a solution can be planned...