
Batch request with Google Cloud Storage python client

I can't find any examples of how to use the Python Google Cloud Storage client's batch functionality. I see it exists here.

I'd love a concrete example. Let's say I want to delete a bunch of blobs with a given prefix. I'd start by getting the list of blobs as follows:

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
blobs_to_delete = bucket.list_blobs(prefix="my/prefix/here")

# how do I delete the blobs in blobs_to_delete in a single batch?

# bonus: if I have more than 100 blobs to delete, handle the limitation
#        that a batch can only handle 100 operations

TL;DR - Just send all the requests within the batch() context manager (available in the google-cloud-python library).

Try this example:

from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
# Accumulate the iterated results in a list before issuing the
# batch within the context manager
blobs_to_delete = list(bucket.list_blobs(prefix="my/prefix/here"))

# Use the batch context manager to delete all the blobs
with storage_client.batch():
    for blob in blobs_to_delete:
        blob.delete()
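
Inside the batch() context the client does not send each delete immediately; the calls are deferred and combined into a single multipart request that is sent when the context manager exits. Note that a batch only saves HTTP round trips: on the server each delete is still an independent operation, so individual deletes within a batch can fail while others succeed.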

You only need to worry about the 100-calls-per-batch limit if you're using the REST API directly. The batch() context manager automatically takes care of this restriction and will issue multiple batch requests if needed.
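
If you'd rather not depend on the client splitting oversized batches for you (for example, to stay explicitly within the JSON API's documented limit of 100 calls per batch), here is a minimal sketch of manual chunking, reusing the setup from the answer above; the chunked helper is my own and not part of the library:

from google.cloud import storage

def chunked(items, size):
    # Hypothetical helper, not part of google-cloud-storage:
    # yield consecutive slices of at most `size` items.
    for i in range(0, len(items), size):
        yield items[i:i + size]

storage_client = storage.Client()
bucket = storage_client.get_bucket('my_bucket_name')
blobs_to_delete = list(bucket.list_blobs(prefix="my/prefix/here"))

for chunk in chunked(blobs_to_delete, 100):
    # Each `with` block sends one batch request when the context exits
    with storage_client.batch():
        for blob in chunk:
            blob.delete()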
