
ORCH - Running R on existing files in HDFS

966878 Member Posts: 4
edited Dec 21, 2012 6:58PM in Big Data Connectors/Hadoop
Is there a way to run R MapReduce via ORCH on existing files in HDFS (i.e., files that were not created using ORCH)?

I am unable to run R scripts on existing HDFS files that are outside the path /user/oracle and that lack ORCH metadata. I wrote a script to convert the existing files to ORCH format, but without much luck.


  • Hello Raj,

    Are you using our latest release, Oracle R Connector for Hadoop 2.0? The demo 'mapred_basic' provides a simple way of running MapReduce tasks on HDFS data using ORCH. To run this demo at the R console, type:

    demo('mapred_basic', package = "ORCH")

    Feel free to send me details on any specific problems you're experiencing.

    Best Regards,
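For readers landing on this archived thread with the same question: ORCH provides hdfs.attach() to attach an HDFS file that was not written by ORCH (generating the metadata the connector needs) and hadoop.run() to execute a MapReduce job over it. The sketch below illustrates that flow; the HDFS path, the counting logic, and the assumption of a delimited input file are hypothetical, and the exact argument names may differ between ORCH releases, so check the package documentation for your version.

```r
# Sketch only: assumes an R session with ORCH 2.x loaded and connected
# to a running Hadoop cluster. The path and job logic are illustrative.
library(ORCH)

# Attach a delimited file that was NOT created by ORCH; hdfs.attach()
# builds the ORCH metadata so the file can be used as job input.
input <- hdfs.attach("/user/someuser/data/input.csv")

# Run a simple record-count per key over the attached file.
res <- hadoop.run(
  input,
  mapper = function(key, val) {
    # emit each record keyed by its original key
    orch.keyval(key, val)
  },
  reducer = function(key, vals) {
    # emit the number of records seen for this key
    orch.keyval(key, length(vals))
  }
)

# Pull the job's HDFS output back into the local R session.
hdfs.get(res)
```

The key point for the original question is hdfs.attach(): it removes the need for a hand-written conversion script, since it creates the ORCH metadata in place for files anywhere in HDFS, not just under /user/oracle.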

This discussion has been closed.