
Anaconda with Spyder: ImportError: cannot import name 'SparkConf'

I have pyspark installed in the testenv1 environment in Anaconda (via: conda install -c conda-forge pyspark ), and I believe it lives here:

/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark/python/pyspark

This path exists. Next I start Spyder:

(testenv1) ➜ ~ spyder

The following code yields the error below. I thought site-packages was automatically "included", or is this a different problem?

import os
os.environ['SPARK_HOME'] = "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark" # Not working, but I'm also not sure why I need this line at all; pyspark appears to be in `site-packages`

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("WordCount")
sc = SparkContext(conf = conf)

And I get this error:

runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')
Traceback (most recent call last):

  File "<ipython-input-1-969f4e596614>", line 1, in <module>
    runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')

  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)

  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "/Users/myuser/dev/projects/python-snippets/pyspark.py", line 13, in <module>
    from pyspark import SparkConf, SparkContext

ImportError: cannot import name 'SparkConf'

Note that I have tried updating the Python interpreter in Spyder to /Users/myuser/anaconda3/envs/testenv1/bin/python3.6, but I get the exact same error.
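A quick way to diagnose this (a sketch, not part of the original post) is to ask the import system which file the name `pyspark` actually resolves to, without importing it:

```python
import importlib.util

# Resolve the module name "pyspark" through the normal import machinery.
# If the printed path is your own script (e.g. .../python-snippets/pyspark.py)
# rather than .../site-packages/pyspark/__init__.py, a local file is
# shadowing the installed package.
spec = importlib.util.find_spec("pyspark")
print(spec.origin if spec else "pyspark not found")
```

Run this from the same working directory Spyder uses; the script's own directory sits at the front of sys.path, which is exactly why a local pyspark.py wins.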

Is python-snippets/pyspark.py your file? If so, you should not name it pyspark.py, as it conflicts with the actual pyspark package.

Rename the file to something else and it should work.
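The shadowing effect is easy to reproduce with any module. The sketch below (my own illustration, using the stdlib json module as a stand-in for pyspark, with hypothetical file names) creates an empty json.py next to a script; the script's directory is first on sys.path, so "import json" picks up the empty local file instead of the standard library:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as workdir:
    # An empty module that shadows the stdlib "json" module.
    with open(os.path.join(workdir, "json.py"), "w") as f:
        f.write("# empty module shadowing stdlib json\n")

    # A script in the same directory; when run, its own folder is the
    # first sys.path entry, so it imports the shadow file.
    script = os.path.join(workdir, "main.py")
    with open(script, "w") as f:
        f.write("import json\n"
                "print(hasattr(json, 'loads'))\n"
                "print(json.__file__)\n")

    result = subprocess.run([sys.executable, script],
                            capture_output=True, text=True)
    # Prints False (the shadow has no 'loads') and the shadow file's path.
    print(result.stdout)
```

This is exactly what happens with a script named pyspark.py: "from pyspark import SparkConf" imports the script itself, which has no SparkConf, hence the ImportError.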
