
How do I get PySpark/Spark to run on both Python 2 and Python 3 on my computer?

I have Spark/PySpark set up for Python 2.7.

I also have Python 3.6 on my computer.

How can I get Spark/PySpark to run on Python 3.6 automatically?

Currently, Spark/PySpark only runs on Python 2.7 on my machine.

Use:

  • the PYSPARK_PYTHON environment variable to set the Python interpreter used by the executors.
  • the PYSPARK_DRIVER_PYTHON environment variable to set the Python interpreter used by the driver (a per-session sketch follows this list).
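
For a single session, a minimal sketch is to export both variables in the shell before launching PySpark. This assumes a python3.6 executable is on your PATH; adjust the name or give a full path if it is not:

$ export PYSPARK_PYTHON=python3.6
$ export PYSPARK_DRIVER_PYTHON=python3.6
$ pyspark        # or: spark-submit your_script.py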

PYSPARK_PYTHON should be set on each worker node. The cleanest way is to put it in $SPARK_HOME/conf/spark-env.sh, i.e.:

$ cat $SPARK_HOME/conf/spark-env.sh
#!/usr/bin/env bash

export PYSPARK_PYTHON=$(which python3.6)   # resolve the full path to the Python 3.6 interpreter
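
As a quick sanity check (not part of the original answer), you can compare the driver and executor interpreter versions from a pyspark shell; the exact version strings depend on your installation:

$ pyspark
>>> import sys
>>> sys.version                                                             # driver interpreter
>>> sc.parallelize([1], 1).map(lambda _: __import__("sys").version).collect()   # executor interpreter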
