
The proper way to run django-rq in a Docker microservices setup

I guess I have a somewhat bad setup of my Docker containers, because every time I run a task from Django, the ps aux output inside the Docker container shows a newly created python manage.py rqworker mail process instead of the existing one being reused. See the screencast: https://imgur.com/a/HxUjzJ5
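
For reference, the task is enqueued with django-rq roughly along these lines (a simplified sketch: the "mail" queue matches my worker command below, while the view function and mail arguments are just illustrative):

import django_rq
from django.core.mail import send_mail

def notify(email):
    # Jobs pushed here should be picked up by the already-running
    # "mail" worker, not by a freshly spawned process.
    queue = django_rq.get_queue("mail")  # queue must be configured in RQ_QUEUES
    queue.enqueue(send_mail, "Subject", "Body", "noreply@example.com", [email])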

The command run by my docker-compose RQ worker container looks like this:

#!/bin/sh -e

# Block until Redis is reachable (wait-for-it helper script)
wait-for-it

# Remove stale worker registrations left behind by a previous container
for KEY in $(redis-cli -h "$REDIS_HOST" -n 2 KEYS "rq:worker*"); do
    redis-cli -h "$REDIS_HOST" -n 2 DEL "$KEY"
done

if [ "$ENVIRONMENT" = "development" ]; then
    python manage.py rqworkers --worker-class rq.SimpleWorker --autoreload;
else
    python manage.py rqworkers --worker-class rq.SimpleWorker --workers 4;
fi
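
The same stale-key cleanup could also be done from Python with redis-py instead of shelling out to redis-cli (a minimal sketch, assuming REDIS_HOST is set and RQ state lives in Redis database 2, as in the script above):

import os
import redis

r = redis.Redis(host=os.environ["REDIS_HOST"], db=2)
# SCAN iterates incrementally, avoiding the blocking behaviour of KEYS
for key in r.scan_iter("rq:worker*"):
    r.delete(key)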

I am new to Docker and was a bit surprised that the worker is started like this, in the foreground and without daemonization... but that is the Docker-ish way of doing things, right?

Here's what I do, with docker-compose:

version: '3'

services:
  web:
    build: .
    image: mysite
    [...]
  rqworker:
    image: mysite
    command: python manage.py rqworker
    [...]
  rqworker_high:
    image: mysite
    command: python manage.py rqworker high
    [...]

Then start with:

$ docker-compose up --scale rqworker_high=4
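
To confirm that these workers stay registered as long-running processes (rather than new ones appearing per task), you can list them with the public rq API (a minimal sketch; the Redis hostname "redis" is an assumption, use whatever your compose service is called):

from redis import Redis
from rq import Worker

conn = Redis(host="redis")  # assumed compose service name for Redis
for worker in Worker.all(connection=conn):
    print(worker.name, worker.queue_names(), worker.get_state())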
