
How can I download the files inside a folder on Google Cloud Platform using Python?

from google.cloud import storage
client = storage.Client()
bucket = client.get_bucket([bucket_name])
blob = bucket.get_blob([path to the .txt file])
blob.download_to_filename([local path to the downloaded .txt file])

How can I adjust my Python code, adding something like for filename in os.listdir(path):, so that it just copies all the files in a certain folder there to my local machine?

First of all, I think it is interesting to highlight that Google Cloud Storage uses a flat namespace, and in fact the concept of "directories" does not exist, as no hierarchical file structure is stored in GCS. More information about how directories work can be found in the documentation, so it is a good read if you are interested in this topic.
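To illustrate that "folders" are just shared name prefixes, here is a minimal sketch (the bucket name and prefix are made up for the example) that lists objects under a prefix and uses a delimiter so the client groups names as if they were directories:

from google.cloud import storage

client = storage.Client()
# Hypothetical bucket and prefix, purely for illustration
bucket = client.bucket("my-example-bucket")

# With a delimiter, list_blobs returns only the objects directly under the
# prefix and collects the "subdirectory" prefixes separately.
iterator = bucket.list_blobs(prefix="reports/", delimiter="/")
for blob in iterator:
    print(blob.name)            # e.g. "reports/2020-01.txt"
print(iterator.prefixes)        # e.g. {"reports/archive/"}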

That being said, you can use a script such as the one I share below in order to download all the files in a "folder" in GCS to the same folder in your local environment. Basically, the only important addition to your own code is that the bucket.list_blobs() method is called with the prefix field pointing to the folder name, in order to look only for blobs whose names match the folder pattern. Then you iterate over them, discard the directory blob itself (which in GCS is just a blob whose name ends in "/"), and download the files.

from google.cloud import storage
import os

# Instantiate a GCS client
client = storage.Client()
bucket_name = "<YOUR_BUCKET_NAME>"

# The "folder" where the files you want to download are
folder = "<YOUR_FOLDER_NAME>/"

# Create this folder locally
if not os.path.exists(folder):
    os.makedirs(folder)

# Retrieve all blobs with a prefix matching the folder
bucket = client.get_bucket(bucket_name)
blobs = list(bucket.list_blobs(prefix=folder))
for blob in blobs:
    if not blob.name.endswith("/"):
        blob.download_to_filename(blob.name)
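Note that if the prefix also contains nested "subfolders", blob names such as folder/sub/file.txt will fail in the snippet above because the local sub directory does not exist yet. A possible extension (a sketch, not part of the original answer) recreates the missing local directories before downloading:

from google.cloud import storage
import os

client = storage.Client()
bucket = client.get_bucket("<YOUR_BUCKET_NAME>")
folder = "<YOUR_FOLDER_NAME>/"

for blob in bucket.list_blobs(prefix=folder):
    if blob.name.endswith("/"):
        continue  # skip the "directory" placeholder blobs
    # Recreate any nested "subfolder" structure locally before downloading
    os.makedirs(os.path.dirname(blob.name), exist_ok=True)
    blob.download_to_filename(blob.name)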


