How can I enqueue a function that will run for a long time?
I want to do the following:

```python
def batch_insert(data):
    rows.append(MyModel(*data))
    if len(rows) > 1000:
        MyModel.objects.bulk_create(rows)
```
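The core of what the question is after is a batching pattern: accumulate items and flush them in bulk once a threshold is reached. Here is a minimal, framework-free sketch of that logic; `BatchInserter`, the `flush` callback, and the batch size are illustrative stand-ins for `MyModel` and `MyModel.objects.bulk_create`:

```python
class BatchInserter:
    """Accumulate rows and flush them in bulk once a threshold is hit."""

    def __init__(self, flush, batch_size=1000):
        self.flush = flush            # called with the accumulated rows
        self.batch_size = batch_size
        self.rows = []

    def add(self, row):
        self.rows.append(row)
        if len(self.rows) >= self.batch_size:
            self.flush(self.rows)     # e.g. MyModel.objects.bulk_create
            self.rows = []            # start a fresh batch
```

Note that the state (`self.rows`) lives outside the `add` call; resetting it to a fresh list inside the function on every call, as in the snippet above, would mean the threshold is never reached.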
Make sure that you have the django-rq app installed and registered in your project's settings.py. You will also need the following setting:
```python
RQ_QUEUES = {
    "default": {
        "USE_REDIS_CACHE": "jobs",
    },
}
```
and the following added to your CACHES setting:
```python
CACHES = {
    # ...
    "jobs": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "{{YOUR REDIS SERVER ADDRESS}}",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        },
    },
}
```
Create a jobs.py file in your app with the job you'd like to enqueue:
```python
from django_rq import job

from myapp.models import MyModel


@job
def batch_insert(data):
    # data is a list of field tuples; build one instance per tuple
    rows = [MyModel(*item) for item in data]
    if len(rows) > 1000:
        MyModel.objects.bulk_create(rows)
    else:
        for row in rows:
            row.save()
```
Import your job into a view that triggers it:

```python
from django.http import HttpResponse

from myapp.jobs import batch_insert


def trigger_batch_insert(request):
    sample_data = []  # Define your data here
    # .delay() enqueues the job on a worker instead of
    # running it synchronously
    batch_insert.delay(sample_data)
    return HttpResponse("Job running!")
```
Then map the view to a URL in your urls.py.
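The URL wiring might look something like this; the route path and name are placeholders, not anything the original prescribes:

```python
# urls.py -- sketch; the path and route name are illustrative
from django.urls import path

from myapp.views import trigger_batch_insert

urlpatterns = [
    path("batch-insert/", trigger_batch_insert, name="batch_insert"),
]
```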
Make sure your RQ workers are running:
```
$ python manage.py rqworker default
```
Send the view a request, and check the console running the RQ workers to see if it worked :)
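If you'd rather confirm programmatically that the job was enqueued, you can inspect the queue from a Django shell (`python manage.py shell`). This is a sketch assuming the django-rq setup above, and it requires the Redis server from your CACHES setting to be running:

```python
import django_rq

# Fetch the queue configured under RQ_QUEUES["default"]
queue = django_rq.get_queue("default")

print(len(queue))     # number of jobs currently waiting
print(queue.job_ids)  # IDs of the queued jobs
```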