Linux Python Scrapy No module named six.moves
We want to use Scrapy on a Linux machine. We use Python 2.7 and installed Scrapy 1.4.0 (pip install scrapy). We added import scrapy to a .py file. When we run the .py file, it gives an error like the one below:
File "mapper.py", line 5, in <module>
import scrapy
File "/usr/local/lib/python2.7/dist-packages/scrapy/__init__.py", line 27, in <module>
from . import _monkeypatches
File "/usr/local/lib/python2.7/dist-packages/scrapy/_monkeypatches.py", line 2, in <module>
from six.moves import copyreg
ImportError: No module named six.moves
We've searched for this issue but cannot find any answers. How can we solve it? Thanks.
Finally we found an answer, shown below:
import os, imp

def load_src(name, fpath):
    # Load a module from an explicit file path and register it under `name`.
    return imp.load_source(name, os.path.join(os.path.dirname(__file__), fpath))

load_src("six", "./six.py")
We import six.py from our own path and can then finally use it. This is actually a workaround; I think the root problem is with the Python environment on the Linux server. In our case we could not access the Linux machine, and many Python versions were installed, so the six library somehow could not be found. So we used this workaround and it worked.
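To see why the interpreter cannot find six, it can help to print which Python binary is actually running and where it searches for modules; on a server with many Python versions, pip may have installed six into a different interpreter's site-packages. A minimal diagnostic sketch (the try/except around import six is illustrative):

```python
import sys

# Show which interpreter is running and where it looks for modules.
print("interpreter:", sys.executable)
for path in sys.path:
    print("search path:", path)

# Check whether six resolves from this interpreter, and from where.
try:
    import six
    print("six found at:", six.__file__)
except ImportError:
    print("six is not importable from this interpreter")
```

If the path printed for six points somewhere unexpected (or six is missing entirely), the pip used for installation likely belongs to a different interpreter than the one running the script.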
Please install the six module if you have not installed it yet.
Install command: pip install six
and then import it with: import six
I was getting the same error, and this fixed it.
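After installing, a quick way to confirm the fix is to attempt the exact import that originally failed inside Scrapy's _monkeypatches.py. A small sketch, wrapped in try/except so it reports rather than crashes if six is still missing:

```python
# Verify that six and the submodule Scrapy needs are importable.
try:
    import six
    from six.moves import copyreg  # the import that originally failed
    print("six", six.__version__, "- copyreg resolves to", copyreg.__name__)
except ImportError as exc:
    print("six is still missing:", exc)
```

If this prints the six version, `import scrapy` should get past the original ImportError.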