
How to export paths into PATH for running pyspark on Amazon EC2

Hi, I have installed Spark, Python, and Jupyter Notebook on an Amazon AWS EC2 instance. When I run "jupyter notebook" at the command prompt, it gives me an address for the notebook, but when I open it I can only run Python commands, not pyspark commands. I googled the problem and found these commands:

export SPARK_HOME=/home/ubuntu/spark-3.0.1-bin-hadoop3.2

export PATH=$SPARK_HOME/bin:$PATH

export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH

After executing them and then typing "jupyter notebook", I get a notebook that supports pyspark. However, when I close the command prompt and log in later, I have to type the above commands again before pyspark works in the notebook. My questions: how do I permanently save those variables as environment variables, and how can I see all of the environment variables, including the ones I set with the commands above?

It's the same situation as environment variables in your local terminal. To have those extra variables exported in every session, save the export commands in the .bashrc file on your EC2 machine.
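As a minimal sketch (assuming the Spark install path from your question, /home/ubuntu/spark-3.0.1-bin-hadoop3.2, and the default Ubuntu bash login shell), you could append the exports to ~/.bashrc, reload it, and then list the environment to confirm:

# append the exports to ~/.bashrc so they survive new logins
echo 'export SPARK_HOME=/home/ubuntu/spark-3.0.1-bin-hadoop3.2' >> ~/.bashrc
echo 'export PATH=$SPARK_HOME/bin:$PATH' >> ~/.bashrc
echo 'export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH' >> ~/.bashrc

# reload the file in the current shell (new SSH sessions pick it up automatically)
source ~/.bashrc

# list all environment variables, including the ones just added
printenv

# or check a single variable
echo $SPARK_HOME

The single quotes keep $SPARK_HOME and $PATH from being expanded when you append the lines, so they are evaluated each time a new shell reads .bashrc.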

More details can easily be found on the Internet, for example here.

