
Exception in thread "main" java.io.IOException: Cannot run program "": error=2, No such file or directory

I faced this issue when executing the following script:

./spark-submit /home/*****/public_html/****/****.py

I first used Python 3.7.2 and later Python 3.5.2, but I still get the following error message:

Exception in thread "main" java.io.IOException: Cannot run program "": error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
    at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:100)
    at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:247)
    at java.lang.ProcessImpl.start(ProcessImpl.java:134)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
    ... 12 more

Before that, there are several warning messages:

2019-02-07 11:30:18 WARN  Utils:66 - Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using xxx.xxx.xxx.xxx instead (on interface eth0)
2019-02-07 11:30:18 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-02-07 11:30:19 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I am able to execute python3 -V, and I am able to launch spark-shell and pyspark.

I also find it very strange that no program name at all is shown between the quotes "".
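The empty quotes presumably mean that Spark resolved the Python executable to an empty string. As a quick diagnostic sketch (not part of the original post), the variables Spark consults when choosing the interpreter can be inspected before launching:

    echo "PYSPARK_PYTHON=${PYSPARK_PYTHON}"
    echo "PYSPARK_DRIVER_PYTHON=${PYSPARK_DRIVER_PYTHON}"
    command -v python3   # confirms python3 is on the PATH

If PYSPARK_PYTHON is set but empty, Spark can end up invoking an empty program name, which would match the Cannot run program "" above.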

My Python code starts with:

import sys
import urllib3
import requests

from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.sql.types import StructType, StructField
from pyspark.sql.types import DoubleType, IntegerType, StringType

from CommonFunctions import *
from LanguageCodeParser import *

I also tried a very simple Python script:

print("This is a test.")

Here are some of the messages after executing bash -x spark-submit test.py:

+ '[' -z /opt/spark-2.3.2-bin-hadoop2.7 ']'
+ export PYTHONHASHSEED=0
+ PYTHONHASHSEED=0
+ exec /opt/spark-2.3.2-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.SparkSubmit test.py

However, it doesn't work. Thanks in advance for your help.

I found that setting PYSPARK_PYTHON=/usr/bin/python3 helps.
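For a one-off run, exporting the variable in the shell before calling spark-submit should be enough. A minimal sketch, assuming /usr/bin/python3 is the interpreter you want:

    export PYSPARK_PYTHON=/usr/bin/python3
    ./spark-submit /home/*****/public_html/****/****.py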

It is best if this environment variable can be set up permanently, as mentioned above, in

/opt/spark-2.3.2-bin-hadoop2.7/conf/spark-env.sh
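For example, the following lines could be appended there (a sketch; if spark-env.sh does not exist yet, it is usually created by copying spark-env.sh.template; the PYSPARK_DRIVER_PYTHON line is an optional addition, not part of the original answer):

    # /opt/spark-2.3.2-bin-hadoop2.7/conf/spark-env.sh
    export PYSPARK_PYTHON=/usr/bin/python3
    export PYSPARK_DRIVER_PYTHON=/usr/bin/python3   # optional: also pins the driver-side interpreter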

On Windows, however, spark-env.sh.template should be renamed to spark-env.cmd and the following inserted:

  set PYSPARK_PYTHON=C:\Users\usr\anaconda3\python.exe
  set PYSPARK_DRIVER_PYTHON=C:\Users\usr\anaconda3\python.exe
  set HADOOP_HOME=c:\winutils

Please also remove all comments starting with "#" in this file, since "#" is not valid comment syntax in a Windows .cmd file. Hope the above works for you as well.
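After either change, re-running the simple test script from above should confirm the fix; the expected behavior (not verified in the original post) is:

    spark-submit test.py
    # should print, among Spark's log output: This is a test.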
