I figured out how to access HDFS data from an Oracle external table through DCH, as shown below, but I still face a problem.
I created the Oracle directories as follows:
CREATE OR REPLACE DIRECTORY hdfs_bin_path AS '/home/oracle/orahdfs/bin';
CREATE OR REPLACE DIRECTORY data_dir AS '/user/hadoop';
where /user/hadoop is the HDFS directory in which the data file test.txt is located.
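One standard Oracle step worth double-checking: if the external table is created by a user other than the directory owner, that user needs privileges on both directory objects. A sketch (the grantee SCOTT is just a placeholder for your schema):

```sql
-- Placeholder grantee; substitute the schema that owns the external table.
GRANT READ, WRITE ON DIRECTORY data_dir TO scott;
GRANT READ, EXECUTE ON DIRECTORY hdfs_bin_path TO scott;
```

Without EXECUTE on the bin directory, the preprocessor script cannot be run by the access driver.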
Then I created the Oracle external table as follows:
create table xqs_ext_test (col1 varchar2(100))
organization external (
  type oracle_loader
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    records delimited by newline
    preprocessor hdfs_bin_path:'hdfs_stream'
    fields terminated by ','
  )
  location ('test.txt')
)
reject limit unlimited;
When I run the SQL statement select * from xqs_ext_test; the following error is raised:
SELECT * FROM xqs_ext_test
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file test.txt in DATA_DIR not found
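KUP-04040 means the access driver looked for test.txt under DATA_DIR on the database server's local filesystem. Oracle DIRECTORY objects always resolve against the local filesystem, so pointing DATA_DIR straight at the HDFS path /user/hadoop cannot work. The usual DCH pattern (a sketch; details vary by connector version) is a local staging directory that holds small location files naming the HDFS content, which the hdfs_stream preprocessor then reads:

```sql
-- Sketch only: /home/oracle/hdfs_loc is an assumed local directory
-- holding the connector's location files, not the HDFS path itself.
CREATE OR REPLACE DIRECTORY data_dir AS '/home/oracle/hdfs_loc';
```

The LOCATION clause of the external table then names files in that local directory, and the preprocessor streams the actual HDFS data.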
Who can help me out with this problem?
Edited by: user1745344 on 2012-05-22 1:02 AM
OK, silly question, but does the file actually exist there? Also, just to check, do you have the required DB patches/code installed?
This problem got solved.