
Scrapy installation error pip and easy_install

I am trying to install Scrapy on Windows and have followed the steps in the Scrapy installation guide to install all the dependencies. However, I got the following error message when I tried to use easy_install:

Download error on https://pypi.python.org/simple/Scrapy/: [Errno 10061]
No connection could be made because the target machine actively refused
it -- Some packages may not be found!
Couldn't find index page for 'Scrapy' (maybe misspelled?)

Scanning index of all packages (this may take a while)
Reading https://pypi.python.org/simple/
Download error on https://pypi.python.org/simple/: [Errno 10061]
No connection could be made because the target machine actively refused
it -- Some packages may not be found!
No local packages or download links found for Scrapy
error: Could not find suitable distribution for
Requirement.parse('Scrapy')

I also tried to use pip, but it doesn't work either:

Downloading/unpacking Scrapy
  Cannot fetch index base URL https://pypi.python.org/simple/
  Could not find any downloads that satisfy the requirement Scrapy
Cleaning up...
No distributions at all found for Scrapy

I checked pip.log, and it says:

Could not fetch URL https://pypi.python.org/simple/Scrapy/: connection
error: HTTPSConnectionPool(host='pypi.python.org', port=443): Max
retries exceeded with url: /simple/Scrapy/ (Caused by
<class 'socket.error'>: [Errno 10061] No connection could be made
because the target machine actively refused it)
  Will skip URL https://pypi.python.org/simple/Scrapy/ when looking
for download links for Scrapy

I can access https://pypi.python.org/simple/Scrapy/ directly from Internet Explorer, but I am not sure why pip or easy_install can't access the link.
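When a browser can reach a URL but pip cannot, a common cause is a proxy or firewall that Internet Explorer is configured to use but pip is not. As a quick diagnostic sketch (not part of the original question), you can check from Python whether a plain TCP connection to the index host succeeds at all; `[Errno 10061]` is Windows' WSAECONNREFUSED, meaning the connection attempt was rejected before it ever reached the server:

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a plain TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, DNS failures, and timeouts alike.
        return False

# If this prints False while the browser works, the block is local
# (firewall or proxy), not on the PyPI side.
print(can_connect("pypi.python.org", 443))
```

If the check fails while the browser works, pip can be pointed at the same proxy with `pip install --proxy http://user:password@proxyhost:port Scrapy` (the proxy host, port, and credentials here are placeholders for your own network's values).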

I'm using Anaconda Python, so easy_install and pip are already part of the package. I also had the following error at the last stage of installing pywin32:

close failed in file object destructor:

sys.excepthook is missing

lost sys.stderr

Is this the reason easy_install and pip failed? Could anyone help with these, please?

Step-by-step way to install Scrapy on Windows 7

  1. Install Python 2.7 from the Python download link. (Be sure to install Python 2.7 only, because Scrapy is currently not available for Python 3 on Windows.)
  2. During the Python install there is a checkbox to add the Python path to the system variables; check that option. Otherwise, you can add the path variable manually: you need to adjust the PATH environment variable to include the paths to the Python executable and its additional scripts. The following paths need to be added to PATH: C:\Python27\;C:\Python27\Scripts\;
     If you have any other problem adding the path variable, please refer to this link.
  3. To update the PATH, open a Command Prompt in administrator mode and run c:\python27\python.exe c:\python27\tools\scripts\win_add2path.py. Close the Command Prompt window and reopen it so the changes take effect, then run the following commands to check that everything was added to the path variable:
     python --version, which should output Python 2.7.12 (your version might be different from mine)
     pip --version, which should output pip 9.0.1 (your version might be different from mine)
  4. Install the Microsoft Visual C++ Compiler for Python. You can download it from the download link.
  5. Then install lxml, the Python XML library used by Scrapy, by running pip install lxml at the command prompt. If you face a problem with the pip installation, you can download an lxml wheel matching your system architecture from http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml, open a command prompt in the download directory, and run pip install NAME_OF_PACKAGE.whl.
  6. Install pywin32 from the download link. Be sure to download the architecture (win32 or amd64) that matches your system.
  7. Then open a command prompt and run pip install scrapy.
  8. For reference, see these links: the Scrapy official page and a blog on how to install Scrapy on Windows.

  I hope this helps you install Scrapy successfully.
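The steps above boil down to a short sequence of commands once Python and the C++ compiler are in place; a condensed sketch (the commented-out wheel filename is only an example and depends on your Python version and architecture):

```shell
# Verify Python and pip are on PATH (exact versions will vary)
python --version
pip --version

# Install lxml; fall back to a downloaded wheel if pip's build fails
pip install lxml
# pip install lxml-3.7.3-cp27-cp27m-win32.whl   # example wheel name

# Finally, install Scrapy itself
pip install scrapy
```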

How to install Scrapy 1.4 on Python 3.6 on Windows 8.1 Pro x64

pip install virtualenv
pip install virtualenvwrapper
pip install virtualenvwrapper-win
mkvirtualenv my_scrapy_project

I advise using virtualenv. In my example I use the name my_scrapy_project for my virtual environment. If you want to leave the virtualenv, simply type deactivate; if you want to go back into it, simply type workon my_scrapy_project.

pip install lxml-4.1.1-cp36-cp36m-win32.whl

pip install scrapy

And that is all; it should work.
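To confirm the install worked, a quick check inside the activated virtualenv (the exact version printed will depend on what you installed):

```shell
scrapy version
python -c "import scrapy; print(scrapy.__version__)"
```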
