
How to copy a directory to google cloud storage using google cloud Python API?

The following function works well for copying a single file to Google Cloud Storage.

#!/usr/bin/python3.5
import googleapiclient.discovery

from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name, project):
  storage_client = storage.Client(project=project)
  bucket = storage_client.get_bucket(bucket_name)
  blob = bucket.blob(destination_blob_name)

  blob.upload_from_filename(source_file_name)

  print('File {} uploaded to {}.'.format(
      source_file_name,
      destination_blob_name))

Now, instead of giving a filename, I tried passing a directory name, upload_blob('mybucket','/data/inputdata/', 'myapp/inputdata/','myapp'), but then I get this error:

AttributeError: 'str' object has no attribute 'read'

Do I need to give any additional parameters when calling the function blob.upload_from_file() to copy a directory?
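For context, that error is typically what you see when a plain path string reaches a call that expects an open, file-like object. A minimal sketch of the difference between the two Blob methods (assuming blob is the bucket.blob(...) object from the function above; the file path is just an example):

# upload_from_filename takes a path string and opens the file itself
blob.upload_from_filename('/data/inputdata/somefile.txt')

# upload_from_file takes an already-open, file-like object (anything with .read())
with open('/data/inputdata/somefile.txt', 'rb') as f:
    blob.upload_from_file(f)

Neither method accepts a directory, which is why you need to loop over the files yourself.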

Uploading more than one file at a time is not a built-in feature of the API. You can either copy several files in a loop, or you can use the command-line utility instead, which can copy whole directories.
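For the command-line route, gsutil (assuming it is installed and authenticated) can copy a whole directory in one command; the bucket and paths below are the ones from the question:

gsutil cp -r /data/inputdata gs://mybucket/myapp/inputdata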

Here's some code you can use to accomplish this:

import os
import glob

def copy_local_directory_to_gcs(local_path, bucket, gcs_path):
    """Recursively copy a directory of files to GCS.

    local_path should be a directory and not have a trailing slash.
    """
    assert os.path.isdir(local_path)
    # recursive=True is needed for '**' to match files in subdirectories as well
    for local_file in glob.glob(local_path + '/**', recursive=True):
        if not os.path.isfile(local_file):
            continue
        # strip the local prefix so the remote path mirrors the local layout
        remote_path = os.path.join(gcs_path, local_file[1 + len(local_path) :])
        blob = bucket.blob(remote_path)
        blob.upload_from_filename(local_file)

Use it like so:

copy_local_directory_to_gcs('path/to/foo', bucket, 'remote/path/to/foo')
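Note that the bucket argument is a Bucket object, not a bucket name. A minimal sketch of obtaining it with the same client used in the question (the project and bucket names are the placeholders from the question):

from google.cloud import storage

storage_client = storage.Client(project='myapp')
bucket = storage_client.get_bucket('mybucket')
copy_local_directory_to_gcs('path/to/foo', bucket, 'remote/path/to/foo')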
