
How to export paths into PATH for running pyspark in Amazon EC2

Hi, I have installed Spark, Python, and Jupyter Notebook on an Amazon AWS EC2 instance, but when I run "jupyter notebook" at the command prompt, it just prints an address for the notebook server, and when I open the notebook I can't run pyspark commands; I can only run python commands. I googled it and found these commands:

export SPARK_HOME=/home/ubuntu/spark-3.0.1-bin-hadoop3.2

export PATH=$SPARK_HOME/bin:$PATH

export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH

After executing them, when I type "jupyter notebook" I get a notebook that supports pyspark. However, when I close my command prompt and log in later, I have to type the above commands again to have pyspark available in the notebook. My questions: how do I permanently save those variables as environment variables, and how can I see all of the environment variables, including the ones I set with the commands above?

It can be a similar case for environment variables in your local terminal. To have those extra variables exported in every session, you need to save the export commands in the .bashrc file on your EC2 machine, as in the sketch below.
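A minimal sketch of making this permanent, assuming Spark lives at /home/ubuntu/spark-3.0.1-bin-hadoop3.2 as in the question (adjust the path to your own installation):

# Append the exports to ~/.bashrc so every new login shell sets them
echo 'export SPARK_HOME=/home/ubuntu/spark-3.0.1-bin-hadoop3.2' >> ~/.bashrc
echo 'export PATH=$SPARK_HOME/bin:$PATH' >> ~/.bashrc
echo 'export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH' >> ~/.bashrc

# Apply the changes to the current shell without logging out
source ~/.bashrc

# Print all environment variables, including the ones just set
printenv

# Or check a single variable
echo $SPARK_HOME

The single quotes keep $SPARK_HOME and $PATH from being expanded by echo, so the lines land in .bashrc literally and are expanded at login time instead.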

More details can be easily found on the Internet, for example here.
