
Django rq to do batch db insert

How can I enqueue a function that will run for a long time?

I want to do the following:

    def batch_insert(data):
        rows.append(MyModel(*data))
        if len(rows) > 1000:
            MyModel.objects.bulk_create(rows)

  1. Make sure that you have the django-rq app installed and registered in your project's settings.py . You will also need the following setting set:

     RQ_QUEUES = {
         "default": {
             "USE_REDIS_CACHE": "jobs",
         },
     }

    and the following added to your CACHES setting:

     CACHES = {
         ...
         "jobs": {
             "BACKEND": "django_redis.cache.RedisCache",
             "LOCATION": "{{YOUR REDIS SERVER ADDRESS}}",
             "OPTIONS": {
                 "CLIENT_CLASS": "django_redis.client.DefaultClient",
             },
         },
     }
  2. Create a jobs.py file in your app, with the job you'd like to enqueue:

     from django_rq import job

     from myapp.models import MyModel

     @job
     def batch_insert(data):
         # data is expected to be a list of argument tuples, one per row
         rows = [MyModel(*item) for item in data]
         # bulk_create issues one INSERT per chunk of batch_size rows,
         # instead of one query per row
         MyModel.objects.bulk_create(rows, batch_size=1000)
  3. Import your job into a view that triggers it

     from django.http import HttpResponse

     from myapp.jobs import batch_insert

     def trigger_batch_insert(request):
         sample_data = ...  # Define your data here
         # .delay() enqueues the job on a worker instead of
         # running it synchronously in the request
         batch_insert.delay(sample_data)
         return HttpResponse("Job running!")
  4. Make sure you hook up the view to a URL route in your urls.py
  5. Make sure your RQ workers are running:

     $ python manage.py rqworker default 
  6. Send the view a request, and check the console running the RQ workers to see if it worked :)
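The chunking that `bulk_create(batch_size=1000)` performs can be sketched without Django. This is a minimal, framework-free illustration of the batching idea only; the `chunked` helper and the plain `rows` list are hypothetical stand-ins, not part of django-rq or the ORM:

```python
def chunked(items, size):
    """Yield successive lists of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Stand-in for 2500 model instances awaiting insertion.
rows = list(range(2500))

# With batch_size=1000, bulk_create would issue one INSERT per chunk:
batches = list(chunked(rows, 1000))
print([len(b) for b in batches])  # → [1000, 1000, 500]
```

Passing the whole dataset to the job once and letting `batch_size` split it is simpler than accumulating rows across calls, since each job invocation starts with its own fresh state.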
