Looking at your information, it seems like a trade-off among time consumption, resource utilization, standard orchestration, design, etc.
Select whichever option is optimal for your specific case.
Also, it looks like a batch process, so you can apply the general best practices for batch processing.
Edited by: user6014009 on Jan 13, 2010 1:06 AM
You use the phrase 'business process flow'; I think the process may have an impact on your design. I would personally lean towards BPEL if there are requirements for error processing or multiple steps involved in the business process. If its primary focus is determining whether a customer already exists, create a service that is abstracted from your application and call it from the service bus.
As a guideline (maybe not completely answering the question, but related): we have established an internal recommended practice that if the component being built is not completely restricted to the application and the single use being defined, we stay away from PL/SQL. If there is any potential for reuse, any variance in the security model, or any need for error processing, we abstract the component away from the application and service-enable it.
Hope this helps!
Here are my 2 cents.
You are missing a timing factor: what is the timeframe requirement for loading/updating the customer records? If your processing cannot keep up with the rate at which customer records are generated, then you have no choice but to go with the second option, where you can bulk load into the DB and do whatever transformation you need there.
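The timing check above can be reduced to a simple arithmetic comparison. This is a minimal sketch; the arrival and processing rates are hypothetical numbers that you would replace with your own measurements of the per-record path versus a bulk (sqlldr plus PL/SQL) load.

```python
# Hypothetical numbers -- substitute your own measurements.
records_per_day = 50_000          # rate at which new customer records arrive
single_rate = 0.4                 # records/sec for one-record-at-a-time processing
bulk_rate = 200.0                 # records/sec for a bulk load into temp tables

SECONDS_PER_DAY = 24 * 60 * 60

def can_keep_up(arrival_per_day: float, processed_per_second: float) -> bool:
    """True if the pipeline processes at least as fast as records arrive."""
    return processed_per_second * SECONDS_PER_DAY >= arrival_per_day

print(can_keep_up(records_per_day, single_rate))  # -> False: forced to bulk load
print(can_keep_up(records_per_day, bulk_rate))    # -> True
```

If the single-record path fails this check, the architectural choice is already made for you, regardless of the other trade-offs.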
I have done many such integration/migration projects. Most of the time I can reuse the PL/SQL that loads the data from pre-defined temp tables into our product tables. The only gap I need to cover for a different data source is writing a Perl script to transform it into the text file that sqlldr can load into the temp tables.
But as addressed above, it depends on your business need. If it's just a one-off thing, then stay away from PL/SQL, because you would need to do a lot of testing for data integrity. Otherwise, try to reuse what you currently have, such as BPEL/SOA, because you save yourself the retesting.
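The transform step described above (source data rewritten as a delimited text file that SQL*Loader feeds into temp tables) can be sketched as follows. The poster used Perl; Python is used here only to illustrate the idea, and the field names are hypothetical. The output matches a control file declaring `FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'`.

```python
import csv
import io

# Hypothetical source rows from the varying data source.
source_rows = [
    {"name": "Acme Corp",  "email": "ops@acme.example",  "country": "US"},
    {"name": "Beta, Ltd.", "email": "info@beta.example", "country": "UK"},
]

def to_sqlldr_file(rows, out):
    """Write rows as comma-delimited text for a SQL*Loader temp-table load.

    Fields containing the delimiter are quoted, which is what the
    OPTIONALLY ENCLOSED BY '"' clause in the control file expects.
    """
    writer = csv.writer(out, quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
    for r in rows:
        writer.writerow([r["name"], r["email"], r["country"]])

buf = io.StringIO()
to_sqlldr_file(source_rows, buf)
print(buf.getvalue())
```

Once the file is in this shape, the existing PL/SQL that moves data from the temp tables into the product tables can be reused unchanged, which is the point of the approach.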
Batch processing has been mentioned above as a solution. Although I know it is used quite frequently, I always try to work around it. The main reason is that the response comes back over a separate communication channel, and you then have to match it against the original input.
Say you get 1000 new customer records in from a flat file.
You hand them over to the database and process them with some PL/SQL.
Now, of those 1000 new records, 5 were incorrect.
Your PL/SQL will then need to return those records and the associated error codes to the caller.
The problem is that there is no caller (requestor), as the data came in as a flat file (probably via FTP).
So splitting this up into 1000 individual requests in the first place at least makes your error handling easier.
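A minimal sketch of the point above: when each record is submitted as its own request, a failure stays paired with the record that caused it, instead of arriving later on a separate batch-error channel that you must reconcile with the original flat file. The `validate` function and its rule are hypothetical stand-ins for the real PL/SQL validation.

```python
def validate(record):
    """Return an error string for a bad record, or None if it is valid."""
    if "@" not in record.get("email", ""):
        return "invalid email"
    return None

def process_individually(records):
    """Process records one by one; failures stay paired with their input."""
    failures = []
    for idx, rec in enumerate(records, start=1):
        err = validate(rec)
        if err is not None:
            failures.append((idx, rec, err))
        # else: hand the record to the insert/update logic
    return failures

records = [
    {"email": "a@x.example"},
    {"email": "broken"},
    {"email": "b@y.example"},
]
print(process_individually(records))
# -> [(2, {'email': 'broken'}, 'invalid email')]
```

The caller knows immediately which request failed and why; there is no second channel to correlate.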
I had the same requirement a few days back, except instead of customers I had quotes coming in on the flat file (a CSV file). I implemented both approaches, but the trade-offs were greater in the first one. Following the second approach, I created a wrapper API that does the validation and the insert/update. In case of any errors, the combined error message for all the erroneous records, with the messages separated by a newline character (chr(10)), was returned to the BPEL process, which then sent it as an email notification to the desired person. This approach suited batch processing much better than the earlier one.
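The wrapper-API idea above can be sketched like this: collect one message per bad record and join them with a newline (PL/SQL's `chr(10)`), so the caller, here a BPEL process, receives a single combined string ready to drop into an email notification. The field names and validation rules are hypothetical.

```python
def validate_quote(row):
    """Return a list of error strings for one quote row (empty if valid)."""
    errors = []
    if not row.get("quote_id"):
        errors.append("missing quote_id")
    if row.get("amount", 0) <= 0:
        errors.append("non-positive amount")
    return errors

def combined_error_message(rows):
    """Return '' when all rows are valid, else one chr(10)-joined message."""
    lines = []
    for i, row in enumerate(rows, start=1):
        for err in validate_quote(row):
            lines.append(f"record {i}: {err}")
    return chr(10).join(lines)

rows = [
    {"quote_id": "Q1", "amount": 100},
    {"quote_id": "",   "amount": -5},
]
print(combined_error_message(rows))
# -> record 2: missing quote_id
#    record 2: non-positive amount
```

Returning one aggregated string keeps the BPEL side simple: an empty result means success, and a non-empty result is forwarded verbatim in the notification email.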
Also, ODI is the best approach for accomplishing this kind of requirement, but you need to evaluate the cost and infrastructure for it.