Just a (big) question:
we want to migrate our application from Linux shell commands (grep, cut, wc, ...) to Java or Pro*C.
The application takes flat files as input (4 flat files per minute); each file contains 100,000 lines that are reformatted with grep and then loaded into a partitioned Oracle 10g database.
What do you experts suggest I implement to make the best use of my database?
Solution 1: use Java code rather than the Linux shell code.
Solution 2: use C/Pro*C code rather than the Linux shell code or Java.
Solution 3: use external tables in Oracle (treat my flat files as external tables) and do all the transformation in native SQL, rather than Linux shell code, Java, or C.
I really need your help.
We don't want to use Teradata because of its high price, and I have been asked to make the development as efficient as possible.
Thanks in advance.
First of all, on what platform does this project run?
I ask because 100,000 records per file can be a lot (on a "little" server) or not (on a "big" one).
If speed is a big concern, I would not use Java: C/Pro*C performs better.
Personally, I would suggest staying "Oracle-centric": use external tables (or SQL*Loader) and PL/SQL or SQL, but the checks currently done by the Linux shell code must be reviewed closely.
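To make the "Oracle-centric" option concrete, here is a minimal sketch of solution 3. The directory path, file name, column layout, and the ACTIVE filter are all assumptions for illustration; adapt them to the real record format. The grep-style filtering becomes a WHERE clause, and the load goes straight into the partitioned target table, optionally as a direct-path insert:

```sql
-- Hypothetical sketch: expose one incoming flat file as an external table.
-- Directory, file name, delimiter, and columns are assumed, not from the post.
CREATE OR REPLACE DIRECTORY load_dir AS '/data/incoming';

CREATE TABLE staging_ext (
  record_id  NUMBER,
  status     VARCHAR2(10),
  amount     NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('input_0001.dat')
)
REJECT LIMIT UNLIMITED;

-- The reformatting/filtering that grep did becomes plain SQL;
-- APPEND requests a direct-path insert into the partitioned target.
INSERT /*+ APPEND */ INTO target_part_tab (record_id, status, amount)
SELECT record_id, status, amount
FROM   staging_ext
WHERE  status = 'ACTIVE';
COMMIT;
```

With 4 files per minute you would swap the LOCATION (or use an ALTER TABLE ... LOCATION per batch) and repeat the INSERT; the bad/log files that ORACLE_LOADER produces replace the ad-hoc checks the shell scripts did, but those checks still need to be reviewed one by one.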