Batch request with Google Cloud Storage python client
I can't find any examples of how to use the batching feature of the Google Cloud Storage python client. I can see that it exists here.
I'd like a concrete example. Say I want to delete a bunch of blobs with a given prefix. I'd start by getting the list of blobs as follows:
from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
blobs_to_delete = bucket.list_blobs(prefix="my/prefix/here")
# how do I delete the blobs in blobs_to_delete in a single batch?
# bonus: if I have more than 100 blobs to delete, handle the limitation
# that a batch can only handle 100 operations
TL;DR - Just issue all the requests within a batch() context manager (available in the google-cloud-python library).
Try this example:
from google.cloud import storage
storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
# Accumulate the iterated results in a list prior to issuing
# the batch within the context manager
blobs_to_delete = list(bucket.list_blobs(prefix="my/prefix/here"))
# Use the batch context manager to delete all the blobs
with storage_client.batch():
    for blob in blobs_to_delete:
        blob.delete()
You only need to worry about the 100-items-per-batch limit if you use the REST API directly. The batch()
context manager handles this limitation automatically and issues multiple batch requests when needed.
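If you'd rather stay within the 100-operation limit explicitly (for instance, if your client library version does not split batches for you), one approach is to chunk the blob list yourself and open one batch per chunk. This is a sketch, not the library's own mechanism; the `chunked` helper below is a hypothetical name, and the commented-out lines assume the `storage_client` and `blobs_to_delete` objects from the example above.

```python
from itertools import islice


def chunked(iterable, size=100):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk


# With the storage client, each chunk would get its own batch request:
#
# for chunk in chunked(blobs_to_delete, 100):
#     with storage_client.batch():
#         for blob in chunk:
#             blob.delete()
```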