GCP Datastore Python - InvalidArgument: 400 A non-transactional commit may not contain multiple mutations affecting the same entity
I have a Cloud Run application that writes records to Cloud Datastore periodically. Each record has the same key every time, so I am updating the record when writing.

The problem is that this application gets a lot of requests and hence autoscales. When all these autoscaled instances write to Cloud Datastore, sometimes they all attempt it at the same time, and that is when I see the exception below:
google.api_core.exceptions.InvalidArgument: 400 A non-transactional commit may not contain multiple mutations affecting the same entity.
Below is skeleton code for the upload function.
from google.cloud import datastore

def datastore_upload(records: list):
    client = datastore.Client()
    kind = "some_kind"
    entities = []
    for record in records:
        name = record['name']
        key = client.key(kind, name)
        task = datastore.Entity(key=key)
        task['x'] = record['x']
        task['y'] = record['y']
        entities.append(task)
    client.put_multi(entities)
You can easily work around this issue by tracking which entities are being updated, e.g.:
from google.cloud import datastore

def datastore_upload(records: list):
    client = datastore.Client()
    kind = "some_kind"
    entities = {}
    for record in records:
        name = record['name']
        key = client.key(kind, name)
        task = datastore.Entity(key=key)
        task['x'] = record['x']
        task['y'] = record['y']
        # Keep only the last update for each name, so no key
        # appears twice in the commit.
        entities[name] = task
    client.put_multi(list(entities.values()))
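The error arises because `put_multi` issues a single non-transactional commit, and Datastore rejects any commit containing two mutations for the same key. The dict-based deduplication can be sketched in isolation, without a Datastore client (a minimal sketch; `dedup_by_name` is a hypothetical helper name, not part of the library):

```python
def dedup_by_name(records: list) -> list:
    """Keep only the last record for each 'name'.

    Later records with the same name overwrite earlier ones, so each
    entity key appears at most once in the final put_multi() batch.
    Dicts preserve insertion order (Python 3.7+), so the first
    occurrence's position is kept.
    """
    deduped = {}
    for record in records:
        deduped[record['name']] = record
    return list(deduped.values())


records = [
    {'name': 'a', 'x': 1, 'y': 2},
    {'name': 'b', 'x': 3, 'y': 4},
    {'name': 'a', 'x': 5, 'y': 6},  # same key: overwrites the first 'a'
]
print(dedup_by_name(records))
```

Only the deduplicated list is then handed to `put_multi`, so the commit never contains two mutations for the same entity.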