
Shell script not running R (Rhipe) program from Java code

I have a simple shell script which looks like this:

R --vanilla < myMRjob.R
hadoop fs -get /output_03/ /home/user/Desktop/hdfs_output/

This shell script runs myMRjob.R and copies the output from HDFS to the local file system. It executes fine from the terminal.

When I try to run the shell script from Java code, I am unable to launch the MapReduce job, i.e. the first line isn't executed, while the "hadoop fs -get ..." line runs fine through the Java code.

The Java code I used is:

import java.io.*;

public class Dtry {

    public static void main(String[] args) {

        File wd = new File("/home/dipesh/");
        System.out.println("Working Directory: " + wd);
        Process proc = null;

        try {
            proc = Runtime.getRuntime().exec("./Recomm.sh", null, wd);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The reason behind this whole exercise is that I want to trigger myMRjob.R and display its result in a JSP.

Please help!

The reason your shell script isn't running from the exec call is that shell scripts are really just text files, not native executables. It is the shell (Bash) that knows how to interpret them, whereas the exec call expects to find a native executable binary.

Adjust your Java like this in order to call the shell and have it run your script:

proc = Runtime.getRuntime().exec("/bin/bash Recomm.sh", null, wd);

When you called hadoop directly from Java, it worked because hadoop is a native executable.
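Since you eventually want to display the script's result in a JSP, you will also need to read the process's output and wait for it to finish; `Runtime.exec` alone does neither. Below is a minimal sketch using `ProcessBuilder` that runs a command through bash and streams its combined stdout/stderr. The class name `ScriptRunner` and the demo `echo` command are illustrative; for your case you would call it with your working directory and `"/bin/bash", "Recomm.sh"` as shown in the comment.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class ScriptRunner {

    // Run a command through bash, printing its combined output,
    // and return the process's exit code.
    static int runScript(File workDir, String... command)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.directory(workDir);            // like the wd argument to exec()
        pb.redirectErrorStream(true);     // merge stderr so R/hadoop errors are visible
        Process proc = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line); // collect this instead to show in a JSP
            }
        }
        return proc.waitFor();            // block until the script finishes
    }

    public static void main(String[] args) throws Exception {
        // For the real job:
        // runScript(new File("/home/dipesh/"), "/bin/bash", "Recomm.sh");
        int exit = runScript(new File("."), "/bin/bash", "-c", "echo hello");
        System.out.println("Exit code: " + exit);
    }
}
```

Reading the output stream matters for long-running jobs: if the pipe's buffer fills up and nothing drains it, the child process can block indefinitely.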
