
How can I clear the Scrapy jobs list?

How can I clear the Scrapy jobs list? Whenever I start a spider, I end up with many jobs for that spider, and I want to know how to kill them all. After reading the documentation I came up with the following code, which I run in a loop:

import ast
import os
from pprint import pprint

# Ask scrapyd for a job and save the JSON response to a file
cd = os.system('curl http://localhost:6800/schedule.json -d project=default -d spider=google > kill_job.text')
file = open('kill_job.text', 'r')
a = ast.literal_eval(file.read())
# Build the cancel command for the returned jobid
kill = 'curl http://localhost:6800/cancel.json -d project=default -d job={}'.format(a['jobid'])
pprint(kill)

cd = os.system(kill)
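For reference, the likely problem with the snippet above is that `schedule.json` is Scrapyd's endpoint for *starting* a new run (it returns the jobid of the job it just scheduled), while `listjobs.json` is the endpoint that reports existing jobs. Parsing the response with `json.loads` is also safer than `ast.literal_eval`, since the body is JSON and may contain tokens like `true` or `null` that are not valid Python literals. A minimal sketch, using an illustrative payload in the shape `listjobs.json` returns:

```python
import json

# Illustrative listjobs.json-style payload (not from a real server)
raw = '{"status": "ok", "pending": [{"id": "abc123"}], "running": [], "finished": []}'

# json.loads handles true/false/null, unlike ast.literal_eval
data = json.loads(raw)
print(data["pending"][0]["id"])  # prints abc123
```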

So, I do not know what was wrong with my first example, but I fixed the problem with this:

import ast
import os

# Fetch the full jobs list for the project
cd = os.system('curl http://localhost:6800/listjobs.json?project=projectname > kill_job.text')
file = open('kill_job.text', 'r')
a = ast.literal_eval(file.read())
b = a.values()
c = b[3]  # Python 2: values() is a list; this picks one of the job lists by position
for i in c:
    kill = 'curl http://localhost:6800/cancel.json -d project=projectname -d job={}'.format(i['id'])
    os.system(kill)
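Note that `b[3]` relies on the position of the values in the response dict, which is fragile. Scrapyd's `listjobs.json` documents its job lists under the keys `pending`, `running` and `finished`, so a keyed lookup is more robust. A small helper along those lines (the function name and sample payload are illustrative, not part of any API):

```python
def job_ids(listjobs_response, states=("pending", "running")):
    """Collect job ids from a listjobs.json-style response by key, not position."""
    return [job["id"] for state in states for job in listjobs_response.get(state, [])]

# Illustrative payload in the shape listjobs.json returns
sample = {
    "status": "ok",
    "pending": [{"id": "p1"}],
    "running": [{"id": "r1"}, {"id": "r2"}],
    "finished": [{"id": "f1"}],
}
print(job_ids(sample))  # pending and running jobs are the cancellable ones
```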

Took @kolas's script and updated it for Python 3:

import json, os
PROJECT_NAME = "MY_PROJECT"

cd = os.system('curl http://localhost:6800/listjobs.json?project={} > kill_job.text'.format(PROJECT_NAME))
with open('kill_job.text', 'r') as f:
    a = json.loads(f.readlines()[0])

pending_jobs = list(a.values())[2]
for job in pending_jobs:
    job_id = job['id']
    kill = 'curl http://localhost:6800/cancel.json -d project={} -d job={}'.format(PROJECT_NAME, job_id)
    os.system(kill)
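The same loop can also be written without shelling out to curl, using only the Python standard library. A sketch under the assumption that Scrapyd is listening on its default address `http://localhost:6800` (the helper names here are my own, not Scrapyd's):

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:6800"  # Scrapyd's default address (assumption)
PROJECT_NAME = "MY_PROJECT"

def cancel_payload(project, job_id):
    """Form-encode the body that curl's -d flags were building."""
    return urllib.parse.urlencode({"project": project, "job": job_id}).encode()

def list_jobs(project):
    """GET listjobs.json and decode the JSON response."""
    url = "{}/listjobs.json?{}".format(BASE, urllib.parse.urlencode({"project": project}))
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def cancel_job(project, job_id):
    """POST to cancel.json for a single job."""
    req = urllib.request.Request("{}/cancel.json".format(BASE), data=cancel_payload(project, job_id))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def cancel_all(project):
    """Cancel every pending and running job for the project."""
    jobs = list_jobs(project)
    for job in jobs.get("pending", []) + jobs.get("running", []):
        cancel_job(project, job["id"])
```

Calling `cancel_all(PROJECT_NAME)` requires a running Scrapyd instance; the functions are only defined above, not executed.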

Disclaimer: the technical posts on this site follow the CC BY-SA 4.0 license; if you repost, please credit this site or the original source.
