How to make PySpark (in the Windows command prompt) run Jupyter Notebook
I have Anaconda Python running Jupyter perfectly, and I have Hadoop, YARN, and Spark running perfectly from the Windows 10 command prompt. I changed a lot of system environment variables in Windows, but everything now works fine.
When I run PySpark, it works.
But I want Jupyter Notebook to start when I run PySpark from cmd, and I cannot get it to.
This should work: in PowerShell, set these environment variables.
$env:PYSPARK_DRIVER_PYTHON = 'jupyter'
$env:PYSPARK_DRIVER_PYTHON_OPTS = 'notebook'
or in CMD:
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHON_OPTS=notebook
Then run pyspark, and it will automatically launch Jupyter Notebook. (Note that in CMD, quotes are not stripped by `set`, so writing `set PYSPARK_DRIVER_PYTHON_OPTS='notebook'` would put the literal quotes into the value and break the launch.)
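Both `$env:` in PowerShell and `set` in CMD only affect the current shell session. If you want these variables to survive closing the window, one possible approach (an assumption about your preferred setup, not part of the original answer) is the built-in Windows `setx` command, which writes them into the persistent user environment:

```shell
:: Persist the Jupyter driver settings for FUTURE cmd/PowerShell sessions.
:: Note: setx does not change the current shell; open a new window afterwards.
setx PYSPARK_DRIVER_PYTHON jupyter
setx PYSPARK_DRIVER_PYTHON_OPTS notebook
```

After opening a new command prompt, running `pyspark` should start Jupyter Notebook directly, with no per-session `set` commands needed.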