
Django + Celery + Supervisord + Redis error when setting up

I am setting up a project on a CentOS server with the components listed below. I got supervisord to start up and serve the site, but I am stuck getting supervisor to manage Celery: it sees the tasks, but when I try to execute them it won't connect to them. My Redis is up and running on port 6380 (a quick way to sanity-check that is sketched after the package list).

Django==1.10.3
amqp==1.4.9
billiard==3.3.0.23
celery==3.1.25
kombu==3.0.37
pytz==2016.10
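
A minimal sanity check, assuming only the Python standard library, that Redis really answers on port 6380 (Redis accepts inline commands such as PING over a raw socket):

import socket

# connect to the Redis instance mentioned above and send an inline PING
s = socket.create_connection(('localhost', 6380), timeout=2)
s.sendall(b'PING\r\n')
print(s.recv(64))   # a healthy server replies b'+PONG\r\n'
s.close()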

My celeryd.ini

[program:celeryd]
command=/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO


environment=PATH="/root/myproject/myprojectenv/bin/",VIRTUAL_ENV="/root/myproject/myprojectenv",PYTHONPATH="/root/myproject/myprojectenv/lib/python2.7:/root/myproject/myprojectenv/lib/python2.7/site-packages"

directory=/home/.../myapp/
user=nobody
numprocs=1
stdout_logfile=/home/.../myapp/log_celery/worker.log
stderr_logfile=/home/.../myapp/log_celery/worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 1200

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; Set Celery priority higher than default (999)
; so, if rabbitmq(redis) is supervised, it will start first.
priority=1000

The process starts, and when I go to the project folder and run:

>python manage.py celery status
celery@ssd-1v: OK
1 node online.

When I open celery's log file, I can see that the tasks have been loaded.

[tasks]
  . mb.tasks.add
  . mb.tasks.update_search_index
  . orders.tasks.order_created

My mb/tasks.py

from mb.celeryapp import app
import django
django.setup()

@app.task
def add(x, y):
    print(x+y)
    return x + y

My mb/celeryapp.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mb.settings")
app = Celery('mb', broker='redis://localhost:6380/', backend='redis://localhost:6380/')
app.conf.broker_url = 'redis://localhost:6380/0'
app.conf.result_backend = 'redis://localhost:6380/'
app.conf.timezone = 'Europe/Sofia'
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
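
Note that Celery 3.1 reads the uppercase setting names (BROKER_URL, CELERY_RESULT_BACKEND); the lowercase broker_url / result_backend spellings only became setting names in Celery 4.x. A minimal sketch, assuming it is run from `python manage.py shell` inside the virtualenv, to see which broker and backend this client-side app instance actually ends up with:

from mb.celeryapp import app

print(app)                                    # <Celery mb:0x...> -- compare the hex id with the worker log banner
print(app.conf.BROKER_URL)                    # broker URL the client will publish tasks to
print(app.conf.get('CELERY_RESULT_BACKEND'))  # None means no result backend, so results can stay PENDING forever
print(app.connection().as_uri())              # the URL kombu will actually connect to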

My mb/settings.py:

...
WSGI_APPLICATION = 'mb.wsgi.application'
BROKER_URL = 'redis://localhost:6380/0'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
...

When I run:

python manage.py shell
>>> from mb.tasks import add
>>> add.name
'mb.tasks.add'
>>> result=add.delay(1,1)
>>> result.ready()
False
>>> result.status
'PENDING'
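
Since the result stays PENDING, it can help to separate "the worker never received the task" from "the worker ran it but there is no result backend to report back". A minimal sketch, assuming the same manage.py shell session, that queries the running worker over the configured broker:

from mb.celeryapp import app

print(app.control.ping(timeout=2.0))   # [] means no worker answered on this broker/port
insp = app.control.inspect(timeout=2.0)
print(insp.registered())               # should list mb.tasks.add for every reachable worker
print(insp.active())                   # tasks currently being executed, if any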

As mentioned before, I don't see any change in the logs. If I try to run it from the command line:

/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO
Running a worker with superuser privileges when the
worker accepts messages serialized with pickle is a very bad idea!

If you really want to continue then you have to set the C_FORCE_ROOT
environment variable (but please think about this before you do).

User information: uid=0 euid=0 gid=0 egid=0

But I suppose that's normal, since I'm running it without a user. Interestingly, the bare command `celery status` (without `python manage.py celery status`) fails only when connecting, probably because it is looking for Redis on a different port, even though the supervisord process started normally... and when I call `celery worker -A mb` it says everything is OK. Any ideas?

(myprojectenv) [root@ssd-1v]# celery status                                      
Traceback (most recent call last):                                                          
  File "/root/myproject/myprojectenv/bin/celery", line 11, in <module>                      
    sys.exit(main())                                                                        
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/__main__.py", line 3
0, in main                                                                                  
    main()                                                                                  
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 81, in main                                                                                
    cmd.execute_from_commandline(argv)                                                      
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 793, in execute_from_commandline                                                           
    super(CeleryCommand, self).execute_from_commandline(argv)))                             
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 3
11, in execute_from_commandline                                                             
    return self.handle_argv(self.prog_name, argv[1:])                                       
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 785, in handle_argv                                                                        
    return self.execute(command, argv)                                                      
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 717, in execute                                                                            
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])                              
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 3
15, in run_from_argv                                                                        
    sys.argv if argv is None else argv, command)                                            
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 3
77, in handle_argv                                                                          
    return self(*args, **options)                                                           
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 2
74, in __call__                                                                             
    ret = self.run(*args, **kwargs)                                                         
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 473, in run                                                                                
    replies = I.run('ping', **kwargs)                                                       
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 325, in run                                                                                
    return self.do_call_method(args, **kwargs)                                              
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line
 347, in do_call_method                                                                     
    return getattr(i, method)(*args)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 100, in ping
    return self._request('ping')
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 71, in _request
    timeout=self.timeout, reply=True,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 316, in broadcast
    limit, callback, channel=channel,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/pidbox.py", line 283, in _broadcast
    chan = channel or self.connection.default_channel
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 771, in default_channel
    self.connection
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 756, in connection
    self._connection = self._establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 711, in _establish_connection
    conn = self.transport.establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/transport/pyamqp.py", line 116, in establish_connection
    conn = self.Connection(**opts)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 165, in __init__
    self.transport = self.Transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 186, in Transport
    return create_transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 299, in create_transport
    return TCPTransport(host, connect_timeout)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 95, in __init__
    raise socket.error(last_err)
socket.error: [Errno 111] Connection refused
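
The traceback itself hints at what is happening: it goes down through kombu/transport/pyamqp.py and amqp/transport.py, so the bare `celery status` call is not using the mb app at all but Celery's default app, whose broker is AMQP on localhost:5672, where nothing is listening. Running `celery -A mb status` from the project directory (inside the virtualenv) should make it use the Redis broker instead. A minimal sketch of the same fallback, reproduced in Python:

from celery import Celery

# roughly what the CLI builds when no -A/--app or -b/--broker is given
default_app = Celery()
print(default_app.connection().as_uri())   # something like amqp://guest@localhost:5672// -- hence "Connection refused"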

Any help would be greatly appreciated.

Update:

When I run

$:python manage.py shell
>>from mb.tasks import add
>>add
<@task: mb.tasks.add of mb:0x2b3f6d0>

This 0x2b3f6d0 is not the same app instance that celery claims to have in its log, namely:

 [config]
- ** ---------- .> app:         mb:0x3495bd0
- ** ---------- .> transport:   redis://localhost:6380/0
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 1 (prefork)

OK, the answer in this case was that the gunicorn file was actually starting the project from the global Python installation instead of from the virtual env.
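
A minimal sketch for checking which interpreter and environment a given process (the gunicorn workers, the manage.py shell, or the celery worker) is really running from, which is exactly the kind of mismatch described above:

import sys

print(sys.executable)   # expect /root/myproject/myprojectenv/bin/python when the virtualenv is in use
print(sys.prefix)       # the environment root this interpreter belongs to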
