External CSV file: embedded line terminators, transforming columns, and bad/log/discard files — oracle-tech


Oracle 19c

Creating an external table against source CSV file with over 450 columns, of which I need maybe 40, scattered throughout the file.

Some columns contain embedded linefeeds. I'm currently declaring input as:

ACCESS PARAMETERS (FIELDS CSV WITH EMBEDDED)

Three questions:

  1. With the "CSV" spec, it seems that I cannot specify bad/log/discard file names and locations, can I? And I can't seem to alter the table afterwards to say where they should go, for example:

alter table my_csv  modify access parameters (badfile MY_DIR:'my_csv.bad');

Correct?
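For context, this is the overall shape of the DDL I'm working from. A minimal sketch only: MY_DIR, the column names, and the file names are placeholders, and whether the BADFILE/LOGFILE/DISCARDFILE clauses are accepted alongside the CSV keyword is exactly what I'm unsure about:

```sql
-- Sketch; directory, column, and file names are hypothetical.
CREATE TABLE my_csv (
  col_a VARCHAR2(100),
  col_b NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY MY_DIR
  ACCESS PARAMETERS (
    BADFILE     MY_DIR:'my_csv.bad'
    LOGFILE     MY_DIR:'my_csv.log'
    DISCARDFILE MY_DIR:'my_csv.dsc'
    FIELDS CSV WITH EMBEDDED
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('my_csv.csv')
)
REJECT LIMIT UNLIMITED;
```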


2. My understanding is that with access parameters other than FIELDS CSV, there's no way to handle embedded line terminators, except perhaps by pre-processing the file.

True?
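If pre-processing turns out to be the answer, I assume it would hang off the PREPROCESSOR clause, along these lines (sketch only; fix_csv.sh and MY_EXEC_DIR are hypothetical — the script would rewrite quoted linefeeds, e.g. to spaces, and emit the cleaned file on stdout):

```sql
-- Sketch: clean embedded linefeeds before the loader parses records.
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  PREPROCESSOR MY_EXEC_DIR:'fix_csv.sh'  -- hypothetical cleanup script
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
)
```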

3. With the CSV spec, it appears there's no way to skip columns or use column transforms to convert them to NULL (which would save the trouble of having to read all 450+ columns correctly every time). Also true?
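For reference, what I'd like to do is something like the transform clause as it works with the non-CSV field syntax — a sketch with placeholder field names, where the unwanted columns would still have to appear in the 450+-entry field list but could at least be forced to NULL:

```sql
-- Sketch; field names are placeholders.
ACCESS PARAMETERS (
  RECORDS DELIMITED BY NEWLINE
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  MISSING FIELD VALUES ARE NULL
  ( col_001 CHAR(4000),
    col_002 CHAR(4000)
    -- ...all 450+ fields would still have to be listed here
  )
  COLUMN TRANSFORMS (col_002 FROM NULL)  -- discard an unwanted column
)
```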

Thanks for insights. Trying to decide next steps.

-John
