
Delayed_job suddenly doesn't seem to do anything?

I have a scraper set up to use delayed_job so that it runs in the background.

class Scraper
  def do_scrape
    # do some scraping stuff
  end
  handle_asynchronously :do_scrape
end
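To clarify what `handle_asynchronously` is doing (and why the log above refers to a `..._without_delay` method), here is a simplified, self-contained sketch of the pattern delayed_job uses. This is not the real gem source: `FakeDelayable` and `ENQUEUED` are illustrative stand-ins. The macro aliases the original method to `<name>_without_delay` and replaces `<name>` with a stub that enqueues the call instead of running it:

```ruby
# Simplified illustration of the handle_asynchronously pattern.
# FakeDelayable and ENQUEUED are hypothetical names, not delayed_job API.
module FakeDelayable
  ENQUEUED = [] # stand-in for the delayed_jobs table

  def handle_asynchronously(name)
    # Keep the original implementation reachable under <name>_without_delay.
    alias_method :"#{name}_without_delay", name
    # Replace <name> with a stub that records (enqueues) the call.
    define_method(name) do |*args|
      ENQUEUED << [self.class.name, :"#{name}_without_delay", args]
    end
  end
end

class Scraper
  extend FakeDelayable

  def do_scrape
    "scraped!"
  end
  handle_asynchronously :do_scrape
end

s = Scraper.new
s.do_scrape                 # enqueues instead of running
p FakeDelayable::ENQUEUED   # => [["Scraper", :do_scrape_without_delay, []]]
p s.do_scrape_without_delay # => "scraped!" (runs immediately, bypassing the queue)
```

This is why calling `do_scrape` in the console with the macro commented out runs synchronously, while leaving it in only inserts a row for the worker to pick up later.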

Now I can comment out the `handle_asynchronously` line, open the console, and run the scraper just fine. It does exactly what I expect it to do.

However, when I try to fire the scrape as a delayed job, it doesn't seem to do anything at all. Beyond that, it doesn't seem to log anything important either.

Here's what my log looks like from enqueueing a job to running rake jobs:work:

County Load (1.0ms)  SELECT "counties".* FROM "counties" WHERE "counties"."name" = 'Fermanagh' LIMIT 1
   (0.1ms)  BEGIN
  SQL (20.5ms)  INSERT INTO "delayed_jobs" ("attempts", "created_at", "failed_at", "handler", "last_error", "locked_at", "locked_by", "priority", "run_at", "updated_at") VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10) RETURNING "id"  [["attempts", 0], ["created_at", Mon, 30 May 2011 21:19:25 UTC +00:00], ["failed_at", nil], ["handler", "--- 

# serialized object omitted for conciseness

nmethod_name: :refresh_listings_in_the_county_without_delay\nargs: []\n\n"], ["last_error", nil], ["locked_at", nil], ["locked_by", nil], ["priority", 0], ["run_at", Mon, 30 May 2011 21:19:25 UTC +00:00], ["updated_at", Mon, 30 May 2011 21:19:25 UTC +00:00]]
   (0.9ms)  COMMIT
  Delayed::Backend::ActiveRecord::Job Load (0.4ms)  SELECT "delayed_jobs".* FROM "delayed_jobs" WHERE (locked_by = 'host:David-Tuites-MacBook-Pro.local pid:7743' AND locked_at > '2011-05-30 17:19:32.116511') LIMIT 1
   (0.1ms)  BEGIN
  SQL (0.3ms)  DELETE FROM "delayed_jobs" WHERE "delayed_jobs"."id" = $1  [["id", 42]]
   (0.4ms)  COMMIT

As you can see, it seems to just insert a job and then delete it straight away. This scraping method should take at least a few minutes.

The worst part is, it was working perfectly last night and I can't think of a single thing I'm doing differently. I tried pinning the gem to a previous version in case it was updated recently, but that doesn't seem to have fixed the problem.

Any ideas?

Have you configured delayed_job to delete failed jobs? Look for the following setting in your initializer:
Delayed::Worker.destroy_failed_jobs = true

If so, set it to false, then look in the delayed_jobs table for the exception that caused the failure (the `last_error` column) and debug from there.
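This setting usually lives in a Rails initializer. A minimal sketch (the file name `config/initializers/delayed_job_config.rb` is a common convention, not a requirement, and `max_attempts` is an optional extra):

```ruby
# config/initializers/delayed_job_config.rb
# Keep failed jobs in the delayed_jobs table so last_error can be inspected.
Delayed::Worker.destroy_failed_jobs = false
# Optional while debugging: fail fast instead of retrying many times.
Delayed::Worker.max_attempts = 3
```

With that in place, after a failure you can inspect the stored jobs from the Rails console, e.g. `Delayed::Job.where("last_error IS NOT NULL").pluck(:last_error)`, to see the exception and backtrace that caused the job to be removed.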
