
JOB_TOO_BIG Pheanstalk - what can be done?

On Laravel 4.2 & Laravel Forge

I made a mistake and accidentally pushed some code to the production server, but there was a bug: it pushed a job onto the queue without deleting it once done. Now I can't push anything to the queue anymore, I get:

Pheanstalk_Exception JOB_TOO_BIG: job data exceeds server-enforced limit

What can I do?

You can increase the max job size with the -z option for Beanstalkd: http://linux.die.net/man/1/beanstalkd

To do this on Forge you need to SSH into the server and edit the /etc/default/beanstalkd file.

Add the following line (or uncomment the existing BEANSTALKD_EXTRA line and edit it): BEANSTALKD_EXTRA="-z 524280"

Restart beanstalkd after making the change: sudo service beanstalkd restart

The size should be specified in bytes.
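
In practice the change looks roughly like this (a sketch only; the file path and service command assume the stock Ubuntu beanstalkd package that Forge installs, and the optional check at the end assumes netcat is available on the server):

    # SSH into the server, then edit the beanstalkd defaults file:
    sudo nano /etc/default/beanstalkd

    # Add (or uncomment and edit) this line inside that file -- the value is in bytes,
    # so 524280 is roughly 512 KiB (the built-in default limit is 65535 bytes):
    BEANSTALKD_EXTRA="-z 524280"

    # Restart beanstalkd so the new limit takes effect:
    sudo service beanstalkd restart

    # Optionally confirm the new limit with beanstalkd's "stats" command:
    printf "stats\r\n" | nc -w 1 localhost 11300 | grep max-job-size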

I am not sure whether this has serious performance effects; so far, so good for me. I would appreciate any comments on performance.

This is because you're trying to store too much data in the queue itself. Try to cut down the data you're pushing to the queue.

For example, if your queue job involves a model, pass only the model's ID to the queue and fetch the model from the database as part of the job, rather than passing the entire model instance to the queue.

If you're using Eloquent models, they're automatically handled in this way.
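
A minimal sketch of that approach using Laravel 4.2's plain queue handler classes (the SendReport class and Report model here are made up for illustration):

    <?php

    // Somewhere in your application code: push only the model's primary key.
    Queue::push('SendReport', array('report_id' => $report->id));

    // SendReport.php -- a plain Laravel 4 queue handler class.
    class SendReport {

        public function fire($job, $data)
        {
            // Re-fetch the model from the database instead of shipping it in the payload.
            $report = Report::find($data['report_id']);

            if ($report) {
                // ... do the actual work with $report here ...
            }

            // Delete the job once it is done so it is not retried indefinitely
            // (a missing delete() is also what caused the original problem).
            $job->delete();
        }
    }

This keeps each payload to a handful of bytes, well under even the default 65535-byte limit.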
