RUN pip install -r requirements.txt does not install requirements in docker container
I am new to Django, Docker, and Scrapy, and I am trying to run a Django application that also uses Scrapy (basically I created a Django app that is also a Scrapy project, and I am trying to call a spider from a Django view). Although Scrapy is listed in requirements.txt
and pip is run from the Dockerfile, the dependency is not installed in the container before python manage.py runserver 0.0.0.0:8000
runs, and the Django app fails during the system check, causing the web container to exit with the following exception:
web_1 | Exception in thread django-main-thread:
web_1 | Traceback (most recent call last):
web_1 | File "/usr/local/lib/python3.7/threading.py", line 926, in _bootstrap_inner
web_1 | self.run()
web_1 | File "/usr/local/lib/python3.7/threading.py", line 870, in run
web_1 | self._target(*self._args, **self._kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/autoreload.py", line 54, in wrapper
web_1 | fn(*args, **kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/commands/runserver.py", line 117, in inner_run
web_1 | self.check(display_num_errors=True)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 390, in check
web_1 | include_deployment_checks=include_deployment_checks,
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/management/base.py", line 377, in _run_checks
web_1 | return checks.run_checks(**kwargs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/registry.py", line 72, in run_checks
web_1 | new_errors = check(app_configs=app_configs)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 40, in check_url_namespaces_unique
web_1 | all_namespaces = _load_all_namespaces(resolver)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/core/checks/urls.py", line 57, in _load_all_namespaces
web_1 | url_patterns = getattr(resolver, 'url_patterns', [])
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1 | res = instance.__dict__[self.name] = self.func(instance)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 579, in url_patterns
web_1 | patterns = getattr(self.urlconf_module, "urlpatterns", self.urlconf_module)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/utils/functional.py", line 80, in __get__
web_1 | res = instance.__dict__[self.name] = self.func(instance)
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/resolvers.py", line 572, in urlconf_module
web_1 | return import_module(self.urlconf_name)
web_1 | File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1 | return _bootstrap._gcd_import(name[level:], package, level)
web_1 | File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1 | File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1 | File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1 | File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1 | File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1 | File "/code/composeexample/urls.py", line 21, in <module>
web_1 | path('scrapy/', include('scrapy_app.urls')),
web_1 | File "/usr/local/lib/python3.7/site-packages/django/urls/conf.py", line 34, in include
web_1 | urlconf_module = import_module(urlconf_module)
web_1 | File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
web_1 | return _bootstrap._gcd_import(name[level:], package, level)
web_1 | File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
web_1 | File "<frozen importlib._bootstrap>", line 983, in _find_and_load
web_1 | File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
web_1 | File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
web_1 | File "<frozen importlib._bootstrap_external>", line 728, in exec_module
web_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
web_1 | File "/code/scrapy_app/urls.py", line 4, in <module>
web_1 | from scrapy_app import views
web_1 | File "/code/scrapy_app/views.py", line 1, in <module>
web_1 | from scrapy.crawler import CrawlerProcess
web_1 | ModuleNotFoundError: No module named 'scrapy'
I have tried using pip3 instead of pip, tried pip install --no-cache-dir -r requirements.txt, and changed the order of the statements in the Dockerfile; I also checked that Scrapy==1.7.3 is present in requirements.txt. Nothing seems to make a difference.
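(For reference, whether the package actually ended up in the built image, independent of the source files, can be checked with something along the lines of:

docker-compose run --rm web pip show Scrapy

where web is the service defined in the docker-compose.yml below; pip show prints the installed version, or a "not found" warning if the package is missing.)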
Here is my Dockerfile:
FROM python:3
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt
COPY . /code/
And this is my docker-compose.yml:
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
It looks like you are missing scrapy in your requirements.txt!
I tried to build a minimal version with all of your components. Hopefully it helps.
test.py
import scrapy
from time import sleep

# Print the imported scrapy module once per second to prove the dependency is installed
def main():
    while True:
        print(scrapy)
        sleep(1)

if __name__ == "__main__":
    main()
requirements.txt
Scrapy==1.7.3
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY requirements.txt .
RUN pip3 install -r requirements.txt
COPY . ./
CMD [ "python3", "test.py" ]
docker-compose.yml
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
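To try this minimal setup (the same flow applies to the full project), build and start the web service with something like:

docker-compose up --build web

which should print the imported scrapy module once per second, confirming that the dependency was installed during the image build.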
A bit late, but I ran into this problem and eventually solved it (I am leaving this here for anyone else with the same issue).
The first time I tried to build my Docker image, my requirements.txt did not contain the required module. Of course I added the missing module, but nothing seemed to happen; that is because the container has to be rebuilt from scratch, otherwise we just keep running the same build again and again.
To rebuild the containers with the updated files, run:
docker-compose rm -f
docker-compose pull
docker-compose up
If that does not work, try the same thing but replace the last line with docker-compose up --build -d.
I got this from this answer.
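(If cached layers still get in the way, a more forceful variant is to rebuild without the cache before starting:

docker-compose build --no-cache
docker-compose up -d

The --no-cache flag forces every layer, including the pip install step, to be re-run instead of reused.)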