
How to ensure Django model saves before Celery task execution?

I have a Django 1.11 + MySQL + Celery 4.1 project where a view creates a new user record and then kicks off a Celery task to perform additional long-running work on that record.

The typical problem in this case is ensuring that the user creation is committed to the database before the Celery task executes. Otherwise, there's a race condition: the task may try to access a record that doesn't exist yet if it runs before the transaction commits.

The way I had learned to fix this was to always wrap the record creation in a manual transaction or atomic block, and then trigger the Celery task only after that block exits, e.g.:

from celery import shared_task
from django.contrib.auth.models import User  # assuming the stock auth User model
from django.db import transaction

def create_user():
    # Create the user inside an explicit atomic block so the INSERT
    # is committed before the task is queued.
    with transaction.atomic():
        user = User.objects.create(username='blah')
    mytask.apply_async(args=[user.id])

@shared_task
def mytask(user_id):
    user = User.objects.get(id=user_id)
    do_stuff(user)  # placeholder for the long-running work

However, I still occasionally see the error DoesNotExist: User matching query does not exist in my Celery worker logs, implying my task is sometimes executing before the user record gets committed.

Is this not the correct strategy or am I not implementing it correctly?

I believe a post_save signal would be more appropriate for what you're trying to do: https://docs.djangoproject.com/en/1.11/ref/signals/#post-save. This signal passes a boolean created argument, making it easy to act only on object creation.
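For illustration, here is a minimal sketch of that approach, assuming the receiver lives in a signals module and that the task above is importable as myapp.tasks.mytask (both names are assumptions, not part of the original post):

from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.tasks import mytask  # hypothetical import path for the task shown above

@receiver(post_save, sender=User)
def queue_user_task(sender, instance, created, **kwargs):
    # `created` is True only on the initial INSERT, so the task is queued
    # once per new user and not on later saves of the same record.
    if created:
        mytask.apply_async(args=[instance.id])

Note that the module containing the receiver has to be imported at startup (commonly from your app's AppConfig.ready()), or the signal will never fire.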
