This discussion is archived
1 Reply · Latest reply: Dec 21, 2012 3:58 PM by Sherry LaMonica

ORCH - Running R on existing files in HDFS

966878 Newbie
Is there a way to run R MapReduce jobs via ORCH on existing files in HDFS, i.e. files that were not created using ORCH?

I am unable to run R scripts on existing HDFS files that are not under the path /user/oracle and that do not have ORCH metadata. I wrote a script to convert the existing files to the ORCH format, but without much luck.
  • 1. Re: ORCH - Running R on existing files in HDFS,
    Sherry LaMonica Journeyer
    Hello Raj,

    Are you using our latest release, Oracle R Connector for Hadoop 2.0? The demo 'mapred_basic' shows a simple way of running MapReduce tasks on HDFS data using ORCH. To run this demo at the R console, type:

    demo('mapred_basic', package = "ORCH")
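    For HDFS files that were not written by ORCH, a sketch along the following lines may help. It assumes ORCH's hdfs.setroot() and hdfs.attach() functions, where hdfs.attach() generates ORCH metadata for an existing delimited file; the directory, file name, and mapper shown here are hypothetical placeholders, so adapt them to your own data:

    ```r
    library(ORCH)

    # Point ORCH at a directory other than the default /user/oracle
    # (this path is a made-up example)
    hdfs.setroot("/user/raj/data")

    # Attach an existing delimited HDFS file that was not created by ORCH;
    # hdfs.attach() inspects the file and creates the metadata ORCH needs
    input <- hdfs.attach("weblogs.csv")

    # The attached descriptor can then be used as the input of a
    # MapReduce job; this identity mapper is purely illustrative
    res <- hadoop.run(input,
        mapper = function(key, val) {
            orch.keyval(key, val)
        })

    # Pull the job output back into the local R session
    hdfs.get(res)
    ```

    This sketch requires a working Hadoop cluster and an ORCH installation, so it is meant as a starting point rather than a verified recipe.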

    Feel free to send me details on any specific problems you're experiencing.

    Best Regards,

    Sherry
