For me, Business Objects are logical groupings of business processes. For example, we have a Business Object called "Maintain_Employees". Under this we have 1 Procedure (Create_Employee) and 2 Events (Update_Employee and Delete_Employee).
We have 1 Oracle system interfacing with 23 other legacy systems. Some of these legacy systems will be using this "Maintain_Employees" Business Object (Common View), and our main transformations will be between the Common View and the legacy Application Views.
We are using a number of techniques to assist in "validating" data in the InterConnect. The main ones are using 'Cross Reference Tables (XREF)' and 'DatabaseOperation' transformations. By using 'Content Based Routing' we are able to send the right message to the right legacy system, and therefore do the right transformation/validation on the message payload. However, this is only a small part of a complex puzzle.
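To make the two techniques above concrete, here is a minimal sketch, in Python purely for illustration: a toy cross-reference (XREF) lookup that translates a system-specific key to the Common View key, and a content-based router that inspects the payload to pick a destination. All system names, keys, and routing rules here are hypothetical, not taken from any real InterConnect configuration.

```python
# Hypothetical XREF table: maps each system's native key to a common key.
XREF_EMPLOYEE = {
    ("LEGACY_HR", "E-1001"): "EMP-001",
    ("ORACLE_APPS", "7369"): "EMP-001",
}

def route(message):
    """Content-based routing: pick a destination from the payload content."""
    if message.get("object") == "Maintain_Employees":
        # e.g. payroll-related changes go to the payroll legacy system
        if message.get("department") == "PAYROLL":
            return "LEGACY_PAYROLL"
        return "LEGACY_HR"
    return "DEFAULT_QUEUE"

def to_common_key(system, native_key):
    """Translate a system-specific key to the Common View key via XREF."""
    return XREF_EMPLOYEE[(system, native_key)]

msg = {"object": "Maintain_Employees", "department": "PAYROLL",
       "system": "ORACLE_APPS", "employee_id": "7369"}
dest = route(msg)
common_id = to_common_key(msg["system"], msg["employee_id"])
```

In the real product the XREF data lives in database tables maintained through InterConnect, and the routing rules are declared in iStudio rather than coded by hand; the sketch only shows the shape of the logic.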
I also have the "problem" of having "very complex SQL" on our Oracle system too. This is not unusual when using the InterConnect.
To my mind, the InterConnect does 2 main operations. Firstly, it performs some message transformation (mapping), and secondly, it acts as a transportation engine (routing) using the adapters.
The remainder of the effort required to create or consume the message resides with the Applications themselves. Whether it is parsing an XML CLOB payload, inserting data into staging tables, writing to log files, pre-processing data, calling API's or something else, your Application side programming and processing overhead can get large.
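As an illustration of that application-side work, here is a small Python sketch of parsing an XML payload (as it might arrive in a CLOB) into a row ready for a staging table. The element names and payload shape are assumptions made up for the example, not the actual message definition.

```python
import xml.etree.ElementTree as ET

# Hypothetical payload shape for the Maintain_Employees Business Object.
PAYLOAD = """
<Maintain_Employees>
  <Employee>
    <EmployeeId>EMP-001</EmployeeId>
    <LastName>Smith</LastName>
    <Action>UPDATE</Action>
  </Employee>
</Maintain_Employees>
"""

def parse_to_staging(xml_text):
    """Extract one staging-table row (as a dict) from the message payload."""
    root = ET.fromstring(xml_text)
    emp = root.find("Employee")
    return {
        "employee_id": emp.findtext("EmployeeId"),
        "last_name": emp.findtext("LastName"),
        "action": emp.findtext("Action"),
    }

row = parse_to_staging(PAYLOAD)
# An INSERT into the staging table would follow here, executed by the
# application's wrapper code with bind variables.
```

In an Oracle shop this parsing would more likely be PL/SQL against the CLOB, but the overhead is the same in either language: every field has to be extracted, validated, and staged before the application API ever sees it.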
The trade-off is to ask the question: do I want to be able to track and manage messages from start to finish in high detail? Or can I trust that all message payload data will be consumed with no additional processing on the Application side?
My experience has shown that the bottleneck is always at the Application side, and almost never in the InterConnect.
The short answer to your first question is: "You are right. Mappings can take place only between Application Views and Common Views - not between Business Objects."
To answer your second question: probably everyone reading this forum has this problem. The intelligence that can really interpret message data, validate it and process it is found only in the Application, not in the InterConnect. You could, however, use the Workflow engine within OAI to provide additional pre-validation, human interaction and logic, but this too could be complex.
At my current client, we are architecting an Application OAI Message handling schema. This will contain staging tables, pre-processing tables, "OAI" wrapper PL/SQL scripts, "APPS" wrapper PL/SQL scripts and Message Logging and Exception tables. Ours will be a complex set of PL/SQL processes too.
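To show the wrapper pattern just described, here is a minimal sketch, in Python for illustration only: each inbound message is logged, handed to an "APPS" wrapper that fronts the application API, and any failure is recorded in an exception table. The function and table names are hypothetical; the real implementation at the client is a set of PL/SQL packages over Oracle tables.

```python
MESSAGE_LOG = []      # stands in for the Message Logging table
EXCEPTION_LOG = []    # stands in for the Exception table

def apps_api(payload):
    """Stand-in for the 'APPS' wrapper that calls the application API."""
    if "employee_id" not in payload:
        raise ValueError("missing employee_id")
    return "OK"

def oai_wrapper(payload):
    """Stand-in for the 'OAI' wrapper: log, call, and capture failures."""
    MESSAGE_LOG.append(payload)
    try:
        return apps_api(payload)
    except ValueError as exc:
        EXCEPTION_LOG.append({"payload": payload, "error": str(exc)})
        return "FAILED"

status_ok = oai_wrapper({"employee_id": "EMP-001", "action": "UPDATE"})
status_bad = oai_wrapper({"action": "DELETE"})  # missing key -> exception row
```

The point of the layering is that every message leaves a trace: a log row on arrival, and either a successful API call or an exception row you can replay from, which is exactly the start-to-finish tracking mentioned earlier.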
I hope this helps, if only by letting you know that you are not alone with this problem.
I wonder if anyone else would like to share how they have architected their InterConnect and Application side mapping and transformation solutions.
But I have found that other EAI products can do this, that is, perform very complex transformations and mappings between different business objects, so the workload and pressure are carried by the EAI engine rather than by the application systems being integrated. Many ETL tools can also do this, so I wonder if there is some method in InterConnect that can solve this problem, which many people will meet when doing an enterprise integration.
Secondly, I have read the ProcessConnect documentation. It says it can do transformation and loading (TL) during data integration, so I want to try ProcessConnect, which is not as simple or as widely used as InterConnect. Can anyone give me some suggestions?