
SparkContext not found on Windows 7

I have installed Spark for PySpark using the method described in this link:

http://nishutayaltech.blogspot.in/2015/04/how-to-run-apache-spark-on-windows7-in.html

Now I am starting the PySpark shell and trying to use the "sc" variable, but I am getting the error below.

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined

I then tried to create the SparkContext myself:

from pyspark import SparkContext
SparkContext.setSystemProperty('spark.executor.memory', '2g')
sc = SparkContext("local", "App Name") 

The error I am getting is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 272, in _ensure_initialized
    callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\bin\..\python\pyspark\shell.py:43

Regarding the following error:

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at D:\BIGDATA\spark-2.1.0-bin-hadoop2.7\bin\..\python\pyspark\shell.py:43

The source of this error is a previous SparkContext that was not stopped: the PySpark shell already creates one for you (the app=PySparkShell context shown in the message), so creating a second one fails.

Executing sc.stop() before trying to create another SparkContext should solve the multiple SparkContexts error.
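For example, inside the PySpark shell (a minimal sketch; the app name and memory setting below are only placeholders, not values your setup requires):

from pyspark import SparkContext, SparkConf

# Stop the SparkContext that the PySpark shell created automatically.
sc.stop()

# Now a new context with custom settings can be created.
conf = SparkConf() \
    .setMaster("local") \
    .setAppName("App Name") \
    .set("spark.executor.memory", "2g")
sc = SparkContext(conf=conf)

Alternatively, SparkContext.getOrCreate() returns the already-running context instead of attempting to create a second one, which avoids the error without stopping anything.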

Some additional links for installing Spark on Windows (from my experience, some of the instructions are missing details):
