If you go to the extent of using BPEL, rather than Perl or some such, then you need to go all the way.
I assume here that you are using OpenESB, GlassFish ESB, or Java CAPS/JBI, that is, a BPEL 2.0 environment.
Define one XML Schema document whose input message structure corresponds to the input CSV and one whose message structure corresponds to the output CSV.
Define custom delimiters on both structures, to be used by the Custom Encoder.
Create WSDLs for the inbound and the outbound side and configure them to use the custom encoder to "decode" CSV to XML on the way in and to "encode" XML to CSV on the way out.
In BPEL, map from XML to XML.
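To make the steps above concrete, here is a minimal sketch of what the input-side XML Schema might look like for a three-field CSV record. The element names, types, and namespace are purely illustrative assumptions; the delimiter metadata for the Custom Encoder is normally attached through the IDE's encoding tooling rather than written by hand.

```xml
<!-- Hypothetical schema for one input CSV record.
     Field names and the target namespace are illustrative only. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/csv/in"
           xmlns="http://example.com/csv/in"
           elementFormDefault="qualified">
  <xs:element name="Record">
    <xs:complexType>
      <xs:sequence>
        <!-- One element per CSV field, in column order -->
        <xs:element name="CustomerId" type="xs:string"/>
        <xs:element name="Name"       type="xs:string"/>
        <xs:element name="Amount"     type="xs:decimal"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A mirror-image schema describes the output CSV, and the WSDLs reference the two schemas so the encoder can decode and encode at the edges while BPEL only ever sees XML.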
There are a number of tutorials on the encoder aspects of the JBI kit. Even one from me: http://blogs.sun.com/javacapsfieldtech/entry/java_caps_6_jbi_note2
Hi. Yes that is an option we have looked at.
However, we were having performance issues, and creating XML objects for each element in the flat file would give us serious problems. Remember, there could be tens of thousands of records.
Fair enough. I am no great fan of XML myself.
I assumed you are reading a multi-record file using the File BC, having the File BC break up the file into records and getting each record delivered to a separate BPEL instance. This would be the "normal" way of dealing with multi-record files where each record needs to be processed individually.
I am very curious as to why you are trying to use BPEL for this. I would consider doing this in Java if I had to deploy the solution to the application server. Since you seem to expect to get the whole payload as a blob and process it, there does not seem to be anything else the BPEL process would be doing, so why use a BPEL process in the first place? Or am I misinterpreting what you said?
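To illustrate the plain-Java alternative, here is a sketch of streaming a large CSV payload one record at a time, so tens of thousands of rows never need to be materialized as XML objects. The class name, field layout, and the summing logic are illustrative assumptions, not anyone's actual processing requirement.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

// Sketch: process a multi-record CSV payload record by record.
// Only one line is held in memory at a time, regardless of file size.
public class CsvStream {

    // Example per-record processing: sum the numeric third column.
    // The aggregation is a stand-in for whatever real work each record needs.
    static double sumThirdColumn(BufferedReader reader) throws IOException {
        double total = 0.0;
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.isEmpty()) continue;          // skip blank lines
            String[] fields = line.split(",", -1); // -1 keeps trailing empty fields
            total += Double.parseDouble(fields[2]);
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In a real deployment the reader would wrap the file or blob payload.
        String payload = "1,Alice,10.5\n2,Bob,4.5\n";
        double total = sumThirdColumn(new BufferedReader(new StringReader(payload)));
        System.out.println(total);
    }
}
```

The same loop works against a `FileReader` over the actual file, and the per-record body can hand each parsed record to whatever downstream logic is needed, without any XML marshalling in the hot path.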