
Scrapy import error: No module named items

I set up my project using

scrapy startproject can411

so the __init__.py files are present.

This is the code in my spider, canada_411Spider.py (not named the same as the project folder):

from scrapy.spider import BaseSpider
from scrapy.selector import HtmlXPathSelector

from can411.items import Can411Item

When I try to import Can411Item, it throws a "module not found" error. (Yes, the items.py file exists and has a class named Can411Item.)

I can fix the problem by editing the PYTHONPATH variable to include

"C:\Python26\ArcGIS10.0;F:\TOOLS\Python Development\ScrapyWork\can411" 

But this seems like an extremely bad way of fixing the problem. Any ideas on how to fix this better, without hard-coding the path into the environment settings?

The file structure is:

F:\TOOLS\Python Development\ScrapyWork\can411\can411\spiders
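The underlying issue can be reproduced without Scrapy at all: `from can411.items import Can411Item` only works if the *outer* project folder (the one containing the inner `can411` package) is on `sys.path`. The sketch below rebuilds the layout in a temporary directory to show this; the file contents are illustrative stand-ins, not the real project files. Running `scrapy crawl` from the directory that contains `scrapy.cfg` has the same effect as the `sys.path` insertion here, which is why the PYTHONPATH hack also works.

```python
import sys
import tempfile
from pathlib import Path

# Recreate the layout from the question in a throwaway directory:
#   <root>/              <- outer project folder (where scrapy.cfg lives)
#       can411/          <- the Python package
#           __init__.py
#           items.py
root = Path(tempfile.mkdtemp())
pkg = root / "can411"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "items.py").write_text("class Can411Item:\n    pass\n")

# Without the outer folder on sys.path, the import fails...
try:
    from can411.items import Can411Item
except ImportError as exc:
    print("import failed:", exc)

# ...and succeeds once that folder is added, which is what running
# `scrapy crawl` from the project root does for you automatically.
sys.path.insert(0, str(root))
from can411.items import Can411Item
print(Can411Item.__name__)  # Can411Item
```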

The files should be called __init__.py not __init.py__ .

What does your directory structure look like?

dirbot/
├── dirbot
│   ├── __init__.py
│   ├── items.py
│   ├── pipelines.py
│   ├── settings.py
│   └── spiders
│       ├── __init__.py
│       └── dmoz.py
├── README.rst
├── scrapy.cfg
└── setup.py
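If the marker files were accidentally saved as `__init.py__`, a short stdlib script can catch that. This is a hypothetical helper (the function name and demo layout are mine, not from the question); it flags any directory that holds `.py` files but lacks a correctly named `__init__.py`:

```python
import tempfile
from pathlib import Path

def find_missing_inits(project_root):
    """List directories that contain .py files but no __init__.py --
    e.g. ones where the file was mistyped as __init.py__."""
    root = Path(project_root)
    missing = []
    for d in [root, *[p for p in root.rglob("*") if p.is_dir()]]:
        has_py = any(f.is_file() and f.suffix == ".py" for f in d.iterdir())
        if has_py and not (d / "__init__.py").exists():
            missing.append(d)
    return missing

# Demo on a throwaway layout with one mistyped marker file:
root = Path(tempfile.mkdtemp())
spiders = root / "can411" / "spiders"
spiders.mkdir(parents=True)
(root / "can411" / "__init__.py").write_text("")
(root / "can411" / "items.py").write_text("")
(spiders / "__init.py__").write_text("")   # typo: wrong name
(spiders / "dmoz.py").write_text("")
print(find_missing_inits(root))            # flags the spiders directory
```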

Also, maybe you could include the full stack trace output along with the command line you are using.
