External CSV file: embedded line terminators, transforming columns, and bad/log/discard files
I'm creating an external table against a source CSV file with over 450 columns, of which I need maybe 40, scattered throughout the file. Some columns contain embedded linefeeds. I'm currently declaring the input as:
ACCESS PARAMETERS (FIELDS CSV WITH EMBEDDED)
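For context, a minimal sketch of the kind of declaration I'm working with (table, directory, column, and file names here are placeholders, not the real ones):

```sql
-- Hypothetical minimal external table using the FIELDS CSV shorthand
-- (ORACLE_LOADER driver); only a couple of the 450+ columns shown.
CREATE TABLE my_csv (
  col_a  VARCHAR2(100),
  col_b  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    FIELDS CSV WITH EMBEDDED  -- quoted fields may contain line terminators
  )
  LOCATION ('source.csv')
)
REJECT LIMIT UNLIMITED;
```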
1. With the "csv" spec, it seems that I cannot specify bad/log/discard file names and locations here, can I? I can't seem to alter the table to specify where these go; for example:
alter table my_csv modify access parameters (badfile MY_DIR:'my_csv.bad');
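In case it helps, the shape I was hoping would work is something like the following, assuming the bad/log/discard clauses can be combined with FIELDS CSV at all (directory and file names are placeholders):

```sql
-- Hoped-for form: bad/log/discard destinations alongside the CSV shorthand.
ALTER TABLE my_csv ACCESS PARAMETERS (
  BADFILE     my_dir:'my_csv.bad'
  LOGFILE     my_dir:'my_csv.log'
  DISCARDFILE my_dir:'my_csv.dsc'
  FIELDS CSV WITH EMBEDDED
);
```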
2. My understanding is that with access parameters other than "CSV", there's no way to handle embedded line terminators, except perhaps by pre-processing the file. Is that correct?
3. With the CSV spec, it appears there's no way to skip columns or use column transforms to convert them to null (which would save having to parse all 450+ columns correctly every time). Is that also true?
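For question 3, the clause I had in mind is COLUMN TRANSFORMS; a sketch of what I'd want, if it can coexist with FIELDS CSV (column names are hypothetical):

```sql
-- Hoped-for form: null out unneeded columns instead of parsing them.
-- Whether COLUMN TRANSFORMS is accepted alongside FIELDS CSV is the question.
ACCESS PARAMETERS (
  FIELDS CSV WITH EMBEDDED
  COLUMN TRANSFORMS (unwanted_col_1 FROM NULL,
                     unwanted_col_2 FROM NULL)
)
```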
Thanks for any insights. I'm trying to decide on next steps.