
Anaconda with spyder: ImportError: cannot import name 'SparkConf'

I have pyspark installed in the testenv1 environment in Anaconda (installed with: conda install -c conda-forge pyspark ); I believe it lives here:

/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark/python/pyspark

This path exists. Next I start spyder:

(testenv1) ➜ ~ spyder

And this code yields the error below. I thought that site-packages are automatically "included" — or is this a different problem?

import os
os.environ['SPARK_HOME'] = "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/pyspark" # Not working; also not sure why this line is needed at all, since pyspark appears to be in `site-packages`

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster("local").setAppName("WordCount")
sc = SparkContext(conf = conf)

And I get the following error:

runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')
Traceback (most recent call last):

  File "<ipython-input-1-969f4e596614>", line 1, in <module>
    runfile('/Users/myuser/dev/projects/python-snippets/pyspark.py', wdir='/Users/myuser/dev/projects/python-snippets')

  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)

  File "/Users/myuser/anaconda3/envs/testenv1/lib/python3.6/site-packages/spyder/utils/site/sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "/Users/myuser/dev/projects/python-snippets/pyspark.py", line 13, in <module>
    from pyspark import SparkConf, SparkContext

  File "/Users/myuser/dev/projects/python-snippets/pyspark.py", line 13, in <module>
    from pyspark import SparkConf, SparkContext

ImportError: cannot import name 'SparkConf'

Note that I have tried updating the Python interpreter in Spyder to /Users/myuser/anaconda3/envs/testenv1/bin/python3.6, but I get the exact same error.
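To check which interpreter the Spyder console is actually running, and in what order it searches for modules, a quick diagnostic sketch (not part of the original post) can be pasted into the console:

    import sys

    # The interpreter this console is actually running — should point at
    # .../envs/testenv1/bin/python3.6 if the Spyder setting took effect:
    print(sys.executable)

    # Directories searched for imports, in order. Note that the script's
    # own directory (here, python-snippets) is searched before site-packages:
    for p in sys.path:
        print(p)

If `sys.executable` already points at the testenv1 interpreter, the interpreter setting is not the problem.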

Is python-snippets/pyspark.py your file? If yes, you should not use the name pyspark.py, because it shadows the real pyspark package: Python searches the script's directory before site-packages, so from pyspark import SparkConf imports your own pyspark.py (which is why the traceback shows the same line twice).

Please rename the file to something else and it should work.
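A quick way to confirm which file Python actually resolves for a given module name is importlib.util.find_spec — a minimal sketch (the helper name resolved_origin is made up for illustration):

    import importlib.util

    def resolved_origin(module_name):
        """Return the file Python would load for module_name, or None if not found."""
        spec = importlib.util.find_spec(module_name)
        return spec.origin if spec else None

    # If this prints a path inside your project (e.g. .../python-snippets/pyspark.py)
    # instead of .../site-packages/pyspark/__init__.py, your local file is
    # shadowing the installed package.
    print(resolved_origin("pyspark"))

Run this from the python-snippets directory before and after renaming the file to see the resolved path switch to site-packages.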
