I added Celery to my Docker setup, and after deploying the changes to the server I can see that Celery is doing its job: tasks are executed faster. Locally, running celery -A tasks worker --loglevel=info works and I can see all the worker logs. On the server, however, I can only see the logs of the Docker container. Can anyone guide me or tell me the command I can use to check whether Celery is working correctly in production?
This is my docker-compose file structure:
services:
  "name":
    image: "image_name"
    ports:
      - "ports"
    environment:
      - "environment variables"
    networks:
      - "networks"
  celery_worker:
    build:
      context: ./parsers
    command: celery -A parsers worker --loglevel=info
    volumes:
      - ./parsers:/parsers
    depends_on:
      - "name"
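Since the worker command above has no --logfile option, Celery writes its logs to the container's stdout/stderr and Docker captures them with its default json-file driver, so they can grow without bound. As a sketch (the logging options shown are standard Docker json-file driver options, not something from the original compose file), log rotation can be enabled per service:

```yaml
# Hypothetical addition to the celery_worker service: cap the size of
# the captured logs using Docker's built-in json-file driver options.
celery_worker:
  build:
    context: ./parsers
  command: celery -A parsers worker --loglevel=info
  logging:
    driver: "json-file"
    options:
      max-size: "10m"   # rotate the log file once it reaches 10 MB
      max-file: "3"     # keep at most 3 rotated files per container
```

With this in place, docker logs still works as usual; only the on-disk retention changes.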
Now I want to see the logs of the Celery worker too. Currently, I can see the server logs with the command sudo docker logs -f "name"
Are you sure that, in your Dockerfile build, the Celery log file is symlinked to stdout or stderr?
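For reference, Celery logs to stderr by default when no --logfile option is given, so with the command shown in the compose file no symlink should be needed. If the worker were configured to write to a log file instead, the usual Docker pattern (a sketch with a hypothetical log path, not taken from the original post) would be to symlink that file to stdout in the Dockerfile:

```dockerfile
# Hypothetical: only needed if the worker is started with
# something like --logfile=/var/log/celery/worker.log
RUN ln -sf /dev/stdout /var/log/celery/worker.log
```

This is the same trick the official nginx image uses to make file-based logs visible via docker logs.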
So, the answer to my question is:
I had added celery_worker as a service in my docker-compose file, so Celery runs in its own container. To check the logs of the Celery worker, I used the following command: sudo docker logs -f "celery_worker_container_name". Note that docker logs takes a container name or ID, not an image name.
You can list all the running Docker containers, along with their names and images, using the following command: sudo docker ps
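Putting it together, a sketch of the commands (the container name placeholder and the compose project layout are assumptions based on the compose file above):

```shell
# List running containers; the NAMES column shows each container's name,
# typically <project>_celery_worker_1 for the service defined above.
sudo docker ps

# Follow the worker's logs by container name
sudo docker logs -f <celery_worker_container_name>

# Or let Compose resolve the container from the service name
sudo docker compose logs -f celery_worker

# Health check: ask the running worker to answer a ping
sudo docker exec <celery_worker_container_name> celery -A parsers inspect ping
```

The inspect ping command is a quick way to confirm the worker is alive and connected to the broker, independent of what the logs show.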