
Docker + Python, issues with own modules

I have a project structured like this:

docker-compose.yml
database>
    models.py
    __init__.py
datajobs>
    check_data.py
    import_data.py
    tasks_name.py
workers>
    Dockerfile
    worker.py
webapp>
    (flask app)

my docker-compose.yml

version: '2'

services:
  # Postgres database
  postgres:
    image: 'postgres:10.3'
    env_file:
      - '.env'
    volumes:
      - 'postgres:/var/lib/postgresql/data'
    ports:
      - '5432:5432'

  # Redis message broker
  redis:
    image: 'redis:3.0-alpine'
    command: redis-server
    volumes:
      - 'redis:/var/lib/redis/data'
    ports:
      - '6379:6379'

#  Flask web app
#  webapp:
#    build: webapp/.
#    command: >
#        gunicorn -b 0.0.0.0:8000
#        --access-logfile -
#        --reload
#        app:create_app()
#    env_file:
#      - '.env'
#    volumes:
#      - '.:/gameover'
#    ports:
#      - '8000:8000'

  # Celery workers to write and pull data + message APIs
  worker:
    build: ./worker
    env_file:
      - '.env'
    volumes:
      - '.:/gameover'
    depends_on:
      - redis

  beat:
    build: ./worker
    entrypoint: celery -A worker beat --loglevel=info
    env_file:
      - '.env'
    volumes:
      - '.:/gameover'
    depends_on:
      - redis

  # Flower server for monitoring celery tasks
  monitor:
    build:
      context: ./worker
      dockerfile: Dockerfile
    ports:
     - "5555:5555"
    entrypoint: flower
    command:  -A worker --port=5555 --broker=redis://redis:6379
    depends_on:
      - redis
      - worker

volumes:
  postgres:
  redis:

I want to reference the database and datajobs modules in my worker. But in Docker I can't COPY files from a parent directory (they are outside the build context), so I can't reference those modules.

I'd prefer to keep them separate like this, because the Flask app will also use these modules. Additionally, if I copied them into each service's folder there would be a lot of duplicate code.

So in the worker I want to do from datajobs.data_pull import get_campaigns, but this module isn't copied over by the Dockerfile, since I can't reference it in the parent folder.
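For illustration, this is the kind of COPY that fails while the build context is ./worker (a hypothetical line, not from the actual Dockerfile):

# in the worker Dockerfile, with build: ./worker as the context
COPY ../datajobs ./datajobs    # fails: Docker forbids paths outside the build context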

Dockerfile in worker

FROM python:3.6-slim
MAINTAINER Gameover

# Redis variables
ENV CELERY_BROKER_URL redis://redis:6379/0
ENV CELERY_RESULT_BACKEND redis://redis:6379/0

# Make worker directory, cd and copy files
ENV INSTALL_PATH /worker
RUN mkdir -p $INSTALL_PATH
WORKDIR /worker
COPY . .


# Install dependencies
RUN pip install -r requirements.txt

# Run the worker
ENTRYPOINT celery -A worker worker --loglevel=info

So, the answer to your question is pretty easy:

  worker:
    build:
      context: .
      dockerfile: worker/Dockerfile
    env_file:
      - '.env'
    volumes:
      - '.:/gameover'
    depends_on:
      - redis

Then, because the build context is now the project root, your Dockerfile can reference all of the paths and copy all of the code you need, including database and datajobs.
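For example, a minimal sketch of what the worker Dockerfile could look like with the project root as the context (the worker/ paths below are assumptions based on your compose file):

FROM python:3.6-slim

# Redis variables
ENV CELERY_BROKER_URL redis://redis:6379/0
ENV CELERY_RESULT_BACKEND redis://redis:6379/0

WORKDIR /worker

# Install dependencies first so this layer stays cached
# (assumes requirements.txt lives in worker/)
COPY worker/requirements.txt .
RUN pip install -r requirements.txt

# The build context is the project root, so the shared
# packages can be copied in next to the worker code
COPY database ./database
COPY datajobs ./datajobs
COPY worker/worker.py .

# Run the worker
ENTRYPOINT celery -A worker worker --loglevel=info

With that layout, from datajobs.data_pull import get_campaigns resolves inside the container because datajobs sits next to worker.py.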

There are a couple other things I notice...

COPY . .
# Install dependencies
RUN pip install -r requirements.txt

This will make you reinstall all your dependencies on every code change, because the COPY . . layer invalidates the Docker build cache for every layer after it. Instead do:

COPY requirements.txt .
# Install dependencies
RUN pip install -r requirements.txt
COPY . .

So you only reinstall them when requirements.txt changes.

Finally: when I set this kind of thing up, I generally build a single image and just override the command to get the worker and beat services, so I don't have to worry about which code is in which container; my Celery code uses many of the same modules as my Flask app does. It will simplify your build process quite a bit... just a suggestion.
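A rough sketch of that idea in docker-compose, reusing the same build for both services and overriding the entrypoint for beat (the worker/ paths are assumptions based on your layout):

  worker:
    build:
      context: .
      dockerfile: worker/Dockerfile
    env_file:
      - '.env'
    depends_on:
      - redis

  beat:
    build:
      context: .
      dockerfile: worker/Dockerfile
    entrypoint: celery -A worker beat --loglevel=info
    env_file:
      - '.env'
    depends_on:
      - redis

Both services build from the same Dockerfile and context, so the layers come from the same cache and the code in worker and beat is always identical.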

RUN pip install -r requirements.txt

Does the above command install the packages into the project/code folder, or directly into the Docker image that is being built for the project?

Edit: I can't comment on the post above due to reputation points.
