
How to run bash commands via pyspark?

I am trying to run bash commands on Spark via Python.

My simple current code is as follows:

import subprocess

print("Test start")
subprocess.Popen("conda install numpy=1.15.2 -n python35env--yes", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).wait()
print("Test end")

The above code executes successfully, and I do see both print statements, "Test start" and "Test end", but the numpy package does not get installed in the python35env environment.

Am I missing anything important for running the conda install command on the Spark worker nodes, or even on the driver node?
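Note: the Popen call above discards the command's output, so any conda error message is lost. A minimal sketch (same command string, assuming conda is on the PATH of the node that runs it) that captures stdout and stderr so the failure can be diagnosed:

import subprocess

proc = subprocess.Popen(
    "conda install numpy=1.15.2 -n python35env --yes",
    shell=True,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
# communicate() waits for the process and returns both output streams
out, err = proc.communicate()
print("return code:", proc.returncode)
print("stdout:", out.decode())
print("stderr:", err.decode())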

You can try running it from the directory that contains the conda executable. By default, subprocess runs the command in the current working directory; the cwd argument changes that.

subprocess.Popen("conda install numpy=1.15.2 -n python35env--yes", shell=True, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE, cwd = 'your conda excutable path')
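For example, a sketch assuming the conda executable lives under /opt/conda/bin (a hypothetical path; substitute the actual install location on your nodes):

import subprocess

proc = subprocess.Popen(
    "conda install numpy=1.15.2 -n python35env --yes",
    shell=True,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    cwd="/opt/conda/bin",  # hypothetical conda location
)
proc.wait()
print("return code:", proc.returncode)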

Using subprocess.

Example:

import subprocess    
result = subprocess.getoutput('<bash command>')
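For instance, with the command from the question (subprocess.getoutput runs the command through the shell and returns its combined stdout and stderr as a string):

import subprocess

result = subprocess.getoutput('conda install numpy=1.15.2 -n python35env --yes')
print(result)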
