I am trying to access HDFS from an Oracle database through DCH. I have installed Oracle Database 11.2, hadoop-0.20.2, and DCH on the same Linux machine.
I also configured hdfs_stream properly and created the Oracle directory HDFS_BIN_PATH according to the README.txt file shipped with DCH.
Now I have run into a problem: how do I create an external table that accesses data residing on HDFS?
I worked out the approach below for reading HDFS data from an Oracle external table through DCH, but I still face some problems.
I created the Oracle directories as follows:
CREATE OR REPLACE DIRECTORY hdfs_bin_path AS '/home/oracle/orahdfs/bin';
CREATE OR REPLACE DIRECTORY data_dir AS '/user/hadoop';
where /user/hadoop is the HDFS directory in which the data file test.txt is located.
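To rule out a simple setup mistake on my side, I double-checked the directory objects against the data dictionary (ALL_DIRECTORIES is a standard Oracle view), and both rows look as expected:

```sql
-- Confirm both directory objects exist and point at the expected paths.
SELECT directory_name, directory_path
  FROM all_directories
 WHERE directory_name IN ('HDFS_BIN_PATH', 'DATA_DIR');
```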
Then I created the Oracle external table as follows:
create table xqs_ext_test (col1 varchar2(100))
organization external (
  type oracle_loader
  default directory data_dir
  access parameters (
    records delimited by newline
    preprocessor hdfs_bin_path:'hdfs_stream'
    fields terminated by ','
  )
  location ('test.txt')
)
reject limit unlimited;
When I run the SQL statement select * from xqs_ext_test; the following error is returned:
SELECT * FROM xqs_ext_test
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file test.txt in DATA_DIR not found
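My current suspicion, from my reading of the DCH README (please correct me if I am wrong): Oracle seems to be looking for test.txt on the local filesystem, not in HDFS. If DCH works by having hdfs_stream read so-called location files out of a local directory, then data_dir should probably point at a local OS path holding those location files rather than at the HDFS path itself, something like:

```sql
-- Assumption: /home/oracle/orahdfs/locations is a local OS directory that
-- would hold the DCH location files; the path here is only an example.
CREATE OR REPLACE DIRECTORY data_dir AS '/home/oracle/orahdfs/locations';
```

Is that the right direction, or am I misunderstanding how DCH resolves the file names in the LOCATION clause?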