
Scrapy installed, but won't run from the command line

I'm trying to run a scraping program I wrote in Python using Scrapy on an Ubuntu machine. Scrapy is installed: I can import it in Python with no problem, and when I try pip install scrapy I get

Requirement already satisfied (use --upgrade to upgrade): scrapy in /system/linux/lib/python2.7/dist-packages

When I try to run scrapy from the command line, with scrapy crawl ... for example, I get:

The program 'scrapy' is currently not installed.

What's going on here? Are the symbolic links messed up? And any thoughts on how to fix it?

Without sudo, pip installs into $HOME/.local/bin, $HOME/.local/lib, etc. Add the following line to your ~/.bashrc or ~/.profile (or the appropriate place for other shells):

export PATH="${PATH}:${HOME}/.local/bin"

then open a new terminal or reload .bashrc, and it should find the command.
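A quick way to confirm whether that directory is already on your PATH is a plain POSIX-shell check (a sketch; no Scrapy required):

```shell
# Report whether ~/.local/bin (pip's --user script directory) is on PATH.
case ":$PATH:" in
  *":$HOME/.local/bin:"*) echo "~/.local/bin is on PATH" ;;
  *)                      echo "~/.local/bin is NOT on PATH" ;;
esac
```

If it reports NOT on PATH even after editing ~/.bashrc, remember the change only takes effect in a new shell or after `source ~/.bashrc`.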

I had the same error. Running scrapy in a virtual environment solved it.

  1. Create a virtual env : python3 -m venv env
  2. Activate your env : source env/bin/activate
  3. Install Scrapy with pip : pip install scrapy
  4. Start your crawler : scrapy crawl your_project_name_here

For example, my project name was kitten, so in step 4 I just ran scrapy crawl kitten

NOTE: I did this on macOS running Python 3+
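The four steps above can be sketched as one terminal session (the project name kitten is just the example from above; substitute your own):

```shell
python3 -m venv env        # 1. create a virtual env in ./env
. env/bin/activate         # 2. activate it (POSIX shells)
command -v pip             # sanity check: should resolve to ./env/bin/pip
pip install scrapy         # 3. install Scrapy inside the env only
scrapy crawl kitten        # 4. run your spider ("kitten" here)
deactivate                 # leave the env when finished
```

Because the env's bin directory is prepended to PATH on activation, the scrapy command is found without touching the system Python at all.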

I tried sudo pip install scrapy, but was promptly advised by Ubuntu 16.04 that it was already installed. I had to first run sudo pip uninstall scrapy, then sudo pip install scrapy for it to install successfully. After that, scrapy ran fine.

If you install Scrapy only in a virtualenv, then the scrapy command doesn't exist in your system bin directory. You can check it like this:

$ which scrapy

For me it is in the following location (because I installed it with sudo):

/usr/local/bin/scrapy

You could also try the full path to your scrapy. For example, if it is installed in a virtualenv:

(env) linux@hero:~dev/myscrapy$ python env/bin/scrapy

Note: We recommend installing Scrapy inside a virtual environment on all platforms.

I faced the same problem and solved it using the following method. I think scrapy was not usable by the current user.

  1. Uninstall scrapy.

    sudo pip uninstall scrapy

  2. Install scrapy again using -H.

    sudo -H pip install scrapy

  3. Scrapy should now work properly.

I had the same issue. sudo pip install scrapy fixed my problem, although I don't know why sudo was necessary.

Make sure you activate your environment with the "Scripts\activate.bat" command (on Windows).

A good workaround is to use pyenv to manage the Python version.

$ brew install pyenv

# Any version 3.6 or above
$ pyenv install 3.7.3
$ pyenv global 3.7.3

# Update Environment to reflect correct python version controlled by pyenv
$ echo -e '\nif command -v pyenv 1>/dev/null 2>&1; then\n  eval "$(pyenv init -)"\nfi' >> ~/.zshrc

# Refresh the terminal
# (or run: source ~/.zshrc)
$ exec $0

$ which python
/Users/mbbroberg/.pyenv/shims/python
$ python -V
Python 3.7.3

# Install scrapy
$ pip install scrapy
$ scrapy --version


sudo pip install scrapy; you should add sudo.

There was another question I found an answer for, and I can see that its answer applies to this question too:

https://stackoverflow.com/a/73619133/18390571

scrapy crawl on its own is not how you start a Scrapy project. You start one by running

scrapy startproject myprojectname

Then, to actually run a spider, go into myprojectname/spiders and call

scrapy crawl "yourspidername" 

To have Scrapy generate a spider, cd into your project directory and execute

scrapy genspider mydomain mydomain.com

Additionally, you can test whether your Scrapy installation actually works by executing

scrapy shell "google.com"

All this information can be found in the Scrapy documentation. If these commands work, then you have actually installed Scrapy and you are crawling (haha) your way to success!

PS: At the time this was written, Scrapy did not work well on Python 3, so the advice was to fall back to Python 2.7 if you still had trouble. (Modern Scrapy releases fully support Python 3.)
