How do I get PySpark/Spark to run on both Python 2 and 3 on my computer?
I have Spark/PySpark set up for Python 2.7.
I also have Python 3.6 on my computer.
How can I get Spark/PySpark to run on Python 3.6 automatically?
Currently, Spark/PySpark only runs on 2.7 on my machine.
Use the
PYSPARK_PYTHON
environment variable to set the Python interpreter for the executors, and the
PYSPARK_DRIVER_PYTHON
environment variable to set the Python interpreter for the driver.
PYSPARK_PYTHON
should be set on each worker node. It is best to set it in
$SPARK_HOME/conf/spark-env.sh
, i.e.
$ cat $SPARK_HOME/conf/spark-env.sh
#!/usr/bin/env bash
PYSPARK_PYTHON=$(which python3.6)
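For a quick local test, you can also export the variables in your shell session before launching pyspark, instead of editing spark-env.sh. This is a minimal sketch; it assumes a `python3.6` binary is on your PATH and falls back to plain `python3` otherwise:

```shell
# Point the executors at Python 3.6 (or python3 if 3.6 is not on PATH).
export PYSPARK_PYTHON="$(command -v python3.6 || echo python3)"
# Use the same interpreter for the driver so the versions match.
export PYSPARK_DRIVER_PYTHON="$PYSPARK_PYTHON"
echo "Using interpreter: $PYSPARK_PYTHON"
```

With both variables exported, launching pyspark in that same shell should start the driver and executors on the chosen Python 3 interpreter; if the driver and executor versions differ, Spark will refuse to run jobs.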