I'm getting a *"No more data to read from socket"* exception when selecting from an external table I've just created. I think I've located the issue, but the only solution I've found is one I'd like to avoid at all costs. Let me explain:
```sql
CREATE OR REPLACE DIRECTORY foo_data as '/home/foo/bar/Data';

CREATE TABLE "foo"."EXT_TABLE02" (
  "SOME_DATE" DATE )
organization external (
  type oracle_loader
  default directory foo_data
  access parameters (
    fields terminated by ','
    ( "SOME_DATE" char(8) date_format date mask 'yyyymmdd' )
  )
  location ('somefile.csv')  -- placeholder; the real file name isn't shown here
);

select * from ext_table02;

drop table ext_table02 purge;
```
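When the ORACLE_LOADER driver rejects rows, it can write diagnostics next to the data; naming the log and bad files explicitly makes them easy to find. A sketch of the clauses I'd add inside `access parameters` (the file names are placeholders):

```sql
-- Hypothetical diagnostic clauses for the access parameters section:
records delimited by newline
badfile foo_data:'ext_table02.bad'   -- rows the driver could not parse
logfile foo_data:'ext_table02.log'   -- per-row error messages
fields terminated by ','
```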
```
$ sqlplus -s foo/bar@foobar @createExtTable02
$ sqlplus -s foo/bar@foobar @createExtTable01
```

The first script runs fine, but the second one (the one with the timestamp column) kills the session:

```
select * from ext_table01
*
ERROR at line 1:
ORA-03113: end-of-file on communication channel
Process ID: 18042
Session ID: 203 Serial number: 41103

ORA-03114: not connected to ORACLE
```
My best guess is that the external table doesn't like the double quotes around the timestamp in the CSV file. I've tried embedding the double quotes in the format mask (inside the single-quoted string), hoping it would match the quotes in the data:

```sql
"SOME_TIMESTAMP" CHAR(30) date_format date mask '"yyyy-mm-dd-hh24.mi.ss.ff"'
```

With that in place the query at least completes, but it fails with an error saying the literal does not match the format string.
The easy way out would be to strip the double quotation marks from the input file, but I'd rather not modify the data we received from our client.
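One avenue that would leave the client's file untouched (an untested sketch, assuming the standard ORACLE_LOADER field syntax applies here) is to tell the driver that fields may be enclosed in double quotes, so it strips them before the mask is applied:

```sql
-- Sketch: let the driver remove the enclosing quotes itself,
-- so the mask only has to describe the bare timestamp.
fields terminated by ',' optionally enclosed by '"'
( "SOME_TIMESTAMP" char(30) date_format timestamp mask 'yyyy-mm-dd-hh24.mi.ss.ff' )
```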
P.S.: I'm using 220.127.116.11 with the latest PSU (JUL 2012?)
**Update:** I've tried removing the double quotation marks and I get the same error, so it must be something else, though still related to the timestamp.
**Update 2:** After removing the decimal part after the seconds it works properly, but I don't think that's an acceptable solution for the customer.
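That result would make sense if the target column is a `DATE`: a `DATE` cannot hold fractional seconds, so the `ff` part of the mask has nowhere to go. If the fractions matter to the customer, a sketch of what I'd expect to work (untested; table and file names are placeholders) is a `TIMESTAMP` column with a matching `timestamp` mask:

```sql
CREATE TABLE "foo"."EXT_TABLE01" (
  "SOME_TIMESTAMP" TIMESTAMP(6)  -- TIMESTAMP keeps the fractional seconds a DATE would drop
)
organization external (
  type oracle_loader
  default directory foo_data
  access parameters (
    fields terminated by ',' optionally enclosed by '"'
    ( "SOME_TIMESTAMP" char(30) date_format timestamp mask 'yyyy-mm-dd-hh24.mi.ss.ff' )
  )
  location ('somefile.csv')  -- placeholder file name
);
```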