Docker-compose connection refused from celery
I am running docker-compose to bring together Django, Celery, Postgres, and RabbitMQ, with the following docker-compose.yml:
version: '2'
services:
  # PostgreSQL database
  db:
    image: postgres:9.4
    hostname: db
    environment:
      - POSTGRES_USER=<XXX>
      - POSTGRES_PASSWORD=<XXX>
      - POSTGRES_DB=<XXX>
    ports:
      - "5431:5432"
  rabbit:
    hostname: rabbit
    image: rabbitmq:3-management
    environment:
      - RABBITMQ_DEFAULT_USER=<XXX>
      - RABBITMQ_DEFAULT_PASS=<XXX>
    ports:
      - "5672:5672"
      - "15672:15672"
  # Django web server
  web:
    build:
      context: .
      dockerfile: Dockerfile
    hostname: web
    command: /srv/www/run_web.sh
    volumes:
      - .:/srv/www
    ports:
      - "8000:8000"
    links:
      - db
      - rabbit
    depends_on:
      - db
  # Celery worker
  worker:
    hostname: celery
    build:
      context: .
      dockerfile: Dockerfile
    command: /srv/www/run_celery.sh
    volumes:
      - .:/srv/www
    links:
      - db
      - rabbit
    depends_on:
      - rabbit
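Note that on the default network Compose creates for this file, the containers reach each other by service name (db:5432, rabbit:5672); the published ports (5431, 5672, 15672, 8000) only matter from the host. Below is a minimal connectivity probe, assuming only the service names above; the filename and the idea of running it via docker-compose exec are illustrative:

# check_net.py -- run inside the web or worker container, e.g.
#   docker-compose exec worker python check_net.py
import socket

for host, port in [('db', 5432), ('rabbit', 5672)]:
    try:
        # create_connection resolves the Compose service name via Docker DNS
        with socket.create_connection((host, port), timeout=5):
            print('%s:%d reachable' % (host, port))
    except OSError as exc:
        print('%s:%d NOT reachable: %s' % (host, port, exc))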
In one of the Django views I delegate out to a Celery task, which does some processing and then tries to POST the results to another web service:
# views.py
@csrf_exempt
def process_data(request):
    if request.method == 'POST':
        #
        # Processing to retrieve data here
        #
        delegate_celery_task.delay(data)
        return HttpResponse(status=200)
# tasks.py
@app.task
def delegate_celery_task(in_data):
    from extractorService.settings import MASTER_NODE
    import json
    import urllib.request  # 'import urllib' alone does not expose urllib.request
    #
    # Some processing on in_data here to give out_data
    #
    data = {'data': out_data}
    params = json.dumps(data).encode('utf8')
    req = urllib.request.Request('http://%s/api/data/' % MASTER_NODE, data=params,
                                 headers={'content-type': 'application/json'})
    urllib.request.urlopen(req)
For now MASTER_NODE is simply localhost:8001, where I am running the other web service. The setup works when I run everything outside of Docker. On starting Docker, though, the worker process gives:
worker_1 | [2016-11-28 12:20:17,527: WARNING/PoolWorker-2] unable to cache TLDs in file /usr/local/lib/python3.5/site-packages/tldextract/.tld_set: [Errno 13] Permission denied: '/usr/local/lib/python3.5/site-packages/tldextract/.tld_set'
and then, on posting to the Django view, the Celery worker starts the task but gives an error on the urlopen call:
worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 368, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/usr/local/lib/python3.5/site-packages/celery/app/trace.py", line 623, in protected_call
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/srv/extractor_django/extractorService/tasks.py", line 25, in extract_entities
worker_1 |     urllib.request.urlopen(req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 162, in urlopen
worker_1 |     return opener.open(url, data, timeout)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 465, in open
worker_1 |     response = self._open(req, data)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 483, in _open
worker_1 |     '_open', req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 443, in _call_chain
worker_1 |     result = func(*args)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1268, in http_open
worker_1 |     return self.do_open(http.client.HTTPConnection, req)
worker_1 |   File "/usr/local/lib/python3.5/urllib/request.py", line 1242, in do_open
worker_1 |     raise URLError(err)
worker_1 | urllib.error.URLError:
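One detail worth flagging before the broker settings: inside the worker container, localhost resolves to the container itself, not to the machine running the other web service, so a MASTER_NODE of localhost:8001 is refused there even though the same code works outside Docker. A minimal sketch of making the target configurable per environment follows; the MASTER_NODE environment variable name and its fallback are assumptions:

# settings.py (sketch): let the deployment supply the master node address.
# Under Docker this would be set to a reachable host or service name;
# the 'localhost:8001' default only works outside Docker.
import os

MASTER_NODE = os.environ.get('MASTER_NODE', 'localhost:8001')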
The Celery config in settings.py is:
# settings.py (Celery section)
import os
from kombu import Exchange, Queue  # Queue/Exchange used below come from kombu

RABBIT_HOSTNAME = os.environ.get('RABBIT_PORT_5672_TCP', 'rabbit')
if RABBIT_HOSTNAME.startswith('tcp://'):
    RABBIT_HOSTNAME = RABBIT_HOSTNAME.split('//')[1]

BROKER_URL = os.environ.get('BROKER_URL', '')
if not BROKER_URL:
    BROKER_URL = 'amqp://{user}:{password}@{hostname}'.format(
        user=os.environ.get('RABBIT_ENV_USER', '<XXX>'),
        password=os.environ.get('RABBIT_ENV_RABBITMQ_PASS', '<XXX>'),
        hostname=RABBIT_HOSTNAME)

BROKER_HEARTBEAT = '?heartbeat=30'
if not BROKER_URL.endswith(BROKER_HEARTBEAT):
    BROKER_URL += BROKER_HEARTBEAT

BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_TIMEOUT = 10

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = (
    Queue('default', Exchange('default'), routing_key='default'),
)
CELERY_ALWAYS_EAGER = False
CELERY_ACKS_LATE = True
CELERY_TASK_PUBLISH_RETRY = True
CELERY_DISABLE_RATE_LIMITS = False
CELERY_IGNORE_RESULT = True
CELERY_SEND_TASK_ERROR_EMAILS = False
CELERY_TASK_RESULT_EXPIRES = 600
CELERYD_HIJACK_ROOT_LOGGER = False
CELERYD_PREFETCH_MULTIPLIER = 1
CELERYD_MAX_TASKS_PER_CHILD = 1000
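With no overriding environment variables, the code above yields a broker URL of the form amqp://<XXX>:<XXX>@rabbit?heartbeat=30, matching the rabbit service in the compose file. As a quick sanity check, here is a sketch that opens a connection to the broker using kombu (already a Celery dependency); the credentials are placeholders and the heartbeat parameter is omitted for brevity:

# broker_check.py (sketch) -- run inside the worker container
from kombu import Connection

# same hostname the settings above resolve to
with Connection('amqp://<XXX>:<XXX>@rabbit', connect_timeout=10) as conn:
    conn.connect()  # raises on failure, e.g. connection refused or bad auth
    print('broker reachable')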
Does anyone have any ideas on how this could be fixed?
You did not mention the version of Celery, but from the post date I can guess it is v4. I just had a similar problem after updating Celery from v3.1 to v4: according to this tutorial, BROKER_URL needs to be changed to CELERY_BROKER_URL in settings.py.
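As a sketch of why the rename matters (the module and app names are taken from the question, the rest is the documented Celery 4 Django pattern): the app reads Django settings through a namespace, so only names carrying the CELERY_ prefix are picked up, and a plain BROKER_URL is silently ignored:

# celery.py (sketch)
from celery import Celery

app = Celery('extractorService')
# Only settings starting with CELERY_ are read, e.g. CELERY_BROKER_URL;
# the old BROKER_URL name no longer matches and is ignored.
app.config_from_object('django.conf:settings', namespace='CELERY')

# settings.py (sketch): rename the broker setting accordingly
# CELERY_BROKER_URL = 'amqp://<XXX>:<XXX>@rabbit?heartbeat=30'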