Move file from /tmp folder to Google Cloud Storage bucket

I originally posted this question when I was having trouble getting my Python Cloud Function to create and write to a new file. Since then I've managed to create a CSV in the /tmp directory, but I'm struggling to find a way to move that file into my bucket's folder where the original CSV was uploaded.

Is it possible to do this? I've looked through the Google Cloud Storage docs and tried using the blob.download_to_filename() and bucket.copy_blob() methods but am currently getting the error: FileNotFoundError: [Errno 2] No such file or directory: 'my-project.appspot.com/my-folder/my-converted-file.csv'

Appreciate any help or advice!

to move that file into my bucket

Here is an example. Bear in mind:

  1. Don't copy and paste without thinking.
  2. The code snippet only illustrates the idea; it won't work as is. Modifications are required to fit your context and requirements.
  3. The _crc32sum function was not developed by me.
  4. I did not test the code. It is written from memory, with some elements copied from different public sources.
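Before the full example: the essential operation behind "moving" a /tmp file into a bucket is a single upload followed by a local delete. Here is a minimal sketch of that idea; the function name and arguments are illustrative, and the client is passed in (duck-typed) so the sketch stays self-contained:

```python
import os


def move_to_bucket(client, bucket_name: str, object_name: str,
                   local_path: str) -> None:
    """Upload a local file to a GCS object, then delete the local copy.

    ``client`` is expected to be a ``google.cloud.storage.Client``
    (duck-typed here so this sketch carries no hard dependency).
    """
    # get a blob reference inside the target bucket
    blob = client.bucket(bucket_name).blob(object_name)
    # upload the local file's contents to that object
    blob.upload_from_filename(local_path)
    # remove the /tmp copy once the upload has succeeded
    os.remove(local_path)
```

For the question's paths, usage would look roughly like `move_to_bucket(storage.Client(), "my-project.appspot.com", "my-folder/my-converted-file.csv", "/tmp/my-converted-file.csv")`. The example below adds checksum verification and error handling on top of this core.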

Here is the code:


import base64
import crc32c  # third-party: pip install crc32c
import os

from google.cloud import exceptions
from google.cloud import storage

# =====> ==============================
# a function to calculate crc32c hash
def _crc32sum(filename: str, blocksize: int = 65536) -> int:
    """Calculate the crc32c hash for a file with the provided name

    :param filename: the name of the file
    :param blocksize: the size of the block for the file reading
    :return: the calculated crc32c hash for the given file
    """
    checksum = 0
    with open(filename, "rb") as f_ref:
        for block in iter(lambda: f_ref.read(blocksize), b""):
            checksum = crc32c.crc32c(block, checksum)  # crc32c.crc32() is a deprecated alias
    return checksum & 0xffffffff
# =====> ==============================

# use the default project in the client initialisation
CS = storage.Client()

lcl_file_name = "/tmp/my-local-file.csv"

tgt_bucket_name = "my-bucket-name"
tgt_object_name = "prefix/another-prefix/my-target-file.csv"

# =====> ==============================
# =====> ==============================
# =====> the process starts here

# https://googleapis.dev/python/storage/latest/_modules/google/cloud/storage/client.html#Client.lookup_bucket
gcs_tgt_bucket_ref = CS.lookup_bucket(tgt_bucket_name)

# check if the target bucket does exist
if gcs_tgt_bucket_ref is None:
    # handle incorrect bucket name or its absence
    # most likely we are to finish the execution here rather than 'pass'
    pass

# calculate the hash for the local file
lcl_crc32c = _crc32sum(lcl_file_name)
base64_crc32c = base64.b64encode(lcl_crc32c.to_bytes(
    length=4, byteorder='big')).decode('utf-8')

# check if the file/object in the bucket already exists
# https://googleapis.dev/python/storage/latest/_modules/google/cloud/storage/bucket.html#Bucket.blob
gcs_file_ref = gcs_tgt_bucket_ref.blob(tgt_object_name)

# https://googleapis.dev/python/storage/latest/_modules/google/cloud/storage/blob.html#Blob.exists
if gcs_file_ref.exists():
    gcs_file_ref.reload()
    # compare crc32c hashes - between the local file and the gcs file/object
    if base64_crc32c != gcs_file_ref.crc32c:
        # the blob file/object in the GCS has a different hash
        # the blob file/object should be deleted and a new one to be uploaded
        # https://googleapis.dev/python/storage/latest/_modules/google/cloud/storage/blob.html#Blob.delete
        gcs_file_ref.delete()
    else:
        # the file/object is already in the bucket
        # most likely we are to finish the execution here rather than 'pass'
        pass

# upload file to the target bucket
# reinit the reference in case the target file/object was deleted
gcs_file_ref = gcs_tgt_bucket_ref.blob(tgt_object_name)
gcs_file_ref.crc32c = base64_crc32c

with open(lcl_file_name, 'rb') as file_obj:
    try:
        gcs_file_ref.metadata = {
            "custom-metadata-key": "custom-metadata-value"
        }
        # https://googleapis.dev/python/storage/latest/_modules/google/cloud/storage/blob.html#Blob.upload_from_file
        gcs_file_ref.upload_from_file(
            file_obj=file_obj, content_type="text/csv", checksum="crc32c")
    except exceptions.GoogleCloudError as gc_err:
        # handle the exception here
        # don't forget to delete the local file if it is not required anymore
        # most likely we are to finish the execution here rather than 'pass'
        pass

# clean behind
if lcl_file_name and os.path.exists(lcl_file_name):
    os.remove(lcl_file_name)

# =====> the process ends here
# =====> ==============================
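One part of the example above that is easy to sanity-check offline is the hash encoding: GCS reports blob.crc32c as the base64 encoding of the big-endian 4-byte crc32c value. Using the well-known CRC-32C check value 0xE3069283 (the checksum of b"123456789"), the encoding step can be verified without touching a bucket; the helper name here is illustrative:

```python
import base64


def crc32c_to_gcs_b64(checksum: int) -> str:
    """Encode an integer crc32c the way GCS reports it in blob.crc32c."""
    # big-endian 4-byte representation, then base64, as GCS expects
    return base64.b64encode(
        checksum.to_bytes(length=4, byteorder="big")).decode("utf-8")


# 0xE3069283 is the standard CRC-32C check value for b"123456789"
print(crc32c_to_gcs_b64(0xE3069283))  # prints 4waSgw==
```

If the base64 string produced from the local file's checksum matches blob.crc32c after gcs_file_ref.reload(), the local file and the bucket object have the same contents.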

Let me know if there are significant mistakes, and I'll modify the example.
