
Running a Spark job with a local Python

I am submitting a Spark job as follows:

spark-submit --conf spark.ui.port=5051 server_code.py

My Python is set to my home user's Python:

export PYSPARK_PYTHON="$HOME/software/anaconda3/bin/python3.7"

However, when I run Spark like this, it cannot access this Python and complains. Is there any way around this? I am able to run Spark jobs with the above configuration when the job doesn't involve a web UI. In the case above, I am serving a web UI with the results.

java.io.IOException: Cannot run program "/x/software/anaconda3/bin/python3.7": error=13, Permission denied

I solved my problem by granting access permissions on the folders starting from the home directory: chmod 777 on each directory in the path, all the way down to chmod 777 on the Python interpreter itself.
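For reference, a minimal sketch of that fix, using the path from the question (error=13 typically means the user the Spark executor runs as cannot traverse the directories leading to the interpreter):

chmod 777 "$HOME"
chmod 777 "$HOME/software"
chmod 777 "$HOME/software/anaconda3"
chmod 777 "$HOME/software/anaconda3/bin"
chmod 777 "$HOME/software/anaconda3/bin/python3.7"

chmod 777 is the blunt option that worked here; granting only traverse permission on the directories (chmod o+x) and read/execute on the interpreter (chmod o+rx) should be a less permissive equivalent.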
