I have a use case in my current project where I need to read a CSV file and replace a column value (item number) with the corresponding ERP Fusion item value by making an ERP web service call. For each record in the file I make a call to the ERP SOAP web service to get the item value.
If the file contains more than 1000 records it is unable to process them and the SOA service times out. As of now I can see two approaches: one is a chunked read of the file, the other is to insert all the file records into a DB and process them further from there.
Currently ODI is not used in our project.
Is there a better approach to this use case? If so, please share it.
Indeed, I think you should at least read the file in chunks.
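To make the chunked idea concrete, here is a minimal plain-Java sketch (not the File Adapter's own chunked-read configuration): the file is consumed a fixed number of records at a time, so each unit of work stays well under the timeout. The chunk size, the column position of the item number, and lookupFusionItem() are assumptions and placeholders for the per-record ERP Fusion SOAP lookup described in the question.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ChunkedCsvProcessor {

    // Tune so one chunk of lookups finishes well within the service timeout (assumed value).
    private static final int CHUNK_SIZE = 200;

    public static void main(String[] args) throws IOException {
        try (BufferedReader reader = Files.newBufferedReader(Path.of("items.csv"))) {
            reader.readLine();                          // skip the header row
            List<String> chunk = new ArrayList<>(CHUNK_SIZE);
            String line;
            while ((line = reader.readLine()) != null) {
                chunk.add(line);
                if (chunk.size() == CHUNK_SIZE) {
                    processChunk(chunk);                // one bounded unit of work
                    chunk.clear();
                }
            }
            if (!chunk.isEmpty()) {
                processChunk(chunk);                    // trailing partial chunk
            }
        }
    }

    private static void processChunk(List<String> rows) {
        for (String row : rows) {
            String[] cols = row.split(",", -1);
            cols[1] = lookupFusionItem(cols[1]);        // assumes the item number is the second column
            // write the rewritten record to the output file / target system here
        }
    }

    // Hypothetical placeholder for the per-record ERP Fusion SOAP call.
    private static String lookupFusionItem(String itemNumber) {
        return itemNumber;                              // a real implementation would invoke the web service
    }
}
```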
But reading the file into the database and then processing it further with a polling adapter is also a good choice. That option lets you process the rows in parallel on multiple nodes, and on a fault only the row that faulted is rolled back.
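As a rough sketch of that staging-table approach, the load step could be a plain JDBC batch insert like the one below, after which a polling consumer (e.g. a DB adapter) picks up the rows one by one. The table ITEM_STAGING, its columns, the STATUS = 'NEW' marker and the JDBC URL are all assumptions for illustration.

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvToStagingLoader {

    public static void main(String[] args) throws Exception {
        String sql = "INSERT INTO ITEM_STAGING (ITEM_NUMBER, RAW_RECORD, STATUS) VALUES (?, ?, 'NEW')";
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB", "app_user", "app_password");
             BufferedReader reader = Files.newBufferedReader(Path.of("items.csv"));
             PreparedStatement ps = con.prepareStatement(sql)) {

            con.setAutoCommit(false);
            reader.readLine();                          // skip the header row
            String line;
            int batched = 0;
            while ((line = reader.readLine()) != null) {
                String[] cols = line.split(",", -1);
                ps.setString(1, cols[1]);               // assumes the item number is the second column
                ps.setString(2, line);                  // keep the full record for later processing
                ps.addBatch();
                if (++batched % 500 == 0) {
                    ps.executeBatch();                  // flush in batches to keep memory bounded
                }
            }
            ps.executeBatch();
            con.commit();                               // one commit for the whole load
        }
    }
}
```

The polling consumer then does the ERP lookup per row and flips the STATUS once it succeeds, so a failed lookup only affects that single row rather than the whole file.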
I guess I prefer the database option, although it is a bit more work.