

Load JSON from HDFS into ORACLE via OSCH

967981 Member Posts: 2
edited Jul 19, 2014 4:50PM in Big Data Connectors/Hadoop


I'm trying to find a way to load JSON data stored in HDFS into an Oracle database via Oracle SQL Connector for HDFS (OSCH).

The most promising approach seemed to me to be setting up an external Hive table and creating an external Oracle table on top of it with the Oracle HDFS external table tool:

add jar hive-json-serde-0.2.jar;

create external table owners_test2 (
  col1 string, col2 double, col3 string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.JsonSerde'
LOCATION 'hdfs://hadoopHDFSCluster/my_json_file_folder';

hadoop jar /<path_to_orahdfs>/orahdfs.jar oracle.hadoop.exttab.ExternalTable -conf /<path_to_my_conf>/table_conf.xml -createTable
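
For reference, the -conf file for the Hive source type typically contains properties along these lines (property names taken from the OSCH documentation; the table names, directory, and connection values here are placeholders, not my actual configuration):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Name of the external table to create in Oracle -->
  <property>
    <name>oracle.hadoop.exttab.tableName</name>
    <value>OWNERS_TEST2_EXT</value>
  </property>
  <!-- Read the data definition from a Hive table -->
  <property>
    <name>oracle.hadoop.exttab.sourceType</name>
    <value>hive</value>
  </property>
  <property>
    <name>oracle.hadoop.exttab.hive.databaseName</name>
    <value>default</value>
  </property>
  <property>
    <name>oracle.hadoop.exttab.hive.tableName</name>
    <value>owners_test2</value>
  </property>
  <!-- Oracle directory object for the external table's location files -->
  <property>
    <name>oracle.hadoop.exttab.defaultDirectory</name>
    <value>MY_EXT_DIR</value>
  </property>
  <!-- Database connection -->
  <property>
    <name>oracle.hadoop.connection.url</name>
    <value>jdbc:oracle:thin:@//dbhost:1521/orcl</value>
  </property>
  <property>
    <name>oracle.hadoop.connection.user</name>
    <value>scott</value>
  </property>
</configuration>
```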

But I'm getting the error:

2014-05-12 16:41:25,927 INFO [main] metastore.HiveMetaStore - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2014-05-12 16:41:25,955 INFO [main] metastore.ObjectStore - ObjectStore, initialize called
2014-05-12 16:41:26,178 INFO [main] DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored
2014-05-12 16:41:26,697 INFO [main] metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2014-05-12 16:41:26,732 INFO [main] metastore.ObjectStore - Initialized ObjectStore
2014-05-12 16:41:27,745 INFO [main] metastore.HiveMetaStore - 0: get_table : db=default tbl=owners_test2
2014-05-12 16:41:27,834 INFO [main] HiveMetaStore.audit - ugi=oracle ip=unknown-ip-addr cmd=get_table : db=default tbl=owners_test2
2014-05-12 16:41:27,839 INFO [main] DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2014-05-12 16:41:27,840 INFO [main] DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.

oracle.hadoop.exttab.ExternalTableException: Unsupported Hive table serialization library org.apache.hadoop.hive.contrib.serde2.JsonSerde
  at oracle.hadoop.exttab.hive.HiveSource.initialize(
  at oracle.hadoop.exttab.hive.HiveSource.getDataSet(
  at oracle.hadoop.exttab.ExternalTable.doCreateTable(
  at oracle.hadoop.exttab.ExternalTable.main(
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(
  at java.lang.reflect.Method.invoke(
  at org.apache.hadoop.util.RunJar.main(

Any suggestions? I literally found nothing on Google!

What are your preferred ways to load JSON data from HDFS into Oracle tables?



  • Mannamal-Oracle
    Mannamal-Oracle Member Posts: 260 Employee

    Oracle SQL Connector for HDFS's support for Hive tables is limited to Hive tables over delimited text files, as documented here. For JSON files such as in your example, you can create a new Hive table that uses the JSON SerDe to produce a delimited text representation of the data, and then use Oracle SQL Connector for HDFS on that new table.

    Alternatively, you can use Oracle Loader for Hadoop, which can read Hive tables over any format Hive supports, and create Oracle Data Pump files from the JSON files. Those Data Pump files can then be queried by Oracle SQL Connector for HDFS.

  • dvohra21
    dvohra21 Member Posts: 14,325 Gold Crown

    Formats other than text file are not supported by OSCH.
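
The conversion suggested in the first reply can be sketched in HiveQL roughly as follows (a sketch only, reusing the table and column names from the original post; the new table name and field delimiter are illustrative assumptions):

```sql
-- Hypothetical sketch: materialize the JSON-backed table as delimited text,
-- so OSCH can read the new table's underlying files.
create table owners_test2_text (
  col1 string, col2 double, col3 string
)
row format delimited fields terminated by ','
stored as textfile;

-- Copy the data; Hive applies the JSON SerDe on read and writes plain text.
insert overwrite table owners_test2_text
select col1, col2, col3 from owners_test2;
```

The OSCH -createTable step would then point at owners_test2_text rather than at the JSON-backed table.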

This discussion has been closed.