
Not able to upload_from_filename files into GCP bucket using python script and GCP service account cred

• Installed Python 3.7.2
• Created a GCP service account and gave it the Owner role; also enabled the Storage API and created a Cloud Storage bucket
• Now I'm trying to upload files to a folder in the Cloud Storage bucket with a Python script, but I can't. Using the same structure, I am able to create a new Cloud Storage bucket and edit existing files in it
• The Python script is attached below

Refs used:
https://googleapis.github.io/google-cloud-python/latest/storage/blobs.html
https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python

    from google.cloud import storage

    bucket_name = 'buckettest'
    source_file_name = 'D:/file.txt'
    source_file_name1 = 'D:/jenkins structure.png'
    destination_blob_name = 'test/'

    def upload_blob(bucket_name, source_file_name, destination_blob_name):
        """Uploads a file to the bucket."""
        client = storage.Client.from_service_account_json('D:\\gmailseviceaccount.json')
        bucket = client.create_bucket('bucketcreate')
        bucket = client.get_bucket(bucket_name)
        blob = bucket.blob(destination_blob_name)
        blob.upload_from_filename(source_file_name)
        blob.upload_from_filename(source_file_name1)
        print('File {} uploaded to {}.'.format(
            source_file_name, destination_blob_name))

    if __name__ == '__main__':
        upload_blob(bucket_name, source_file_name, destination_blob_name)

I was able to run your code and debug it. I will put what I used below and explain the changes I made.

As you did, I set my service account as Owner and was able to upload. I recommend following the best practice of least privilege once you're done testing.

  1. I removed client.create_bucket since bucket names are globally unique, so we shouldn't hard-code names for buckets to create. You can come up with a naming convention for your needs; for testing, however, I removed it.
  2. I fixed the variable destination_blob_name since you were using it as a folder for the file to be placed in. This does not work the way you expect: GCS does not have folders, it only has object names. What was actually happening is that your TXT file was being uploaded to an object literally named 'test/'. For a better understanding, I recommend looking through the documentation on How Sub-directories Work (a short listing sketch follows this list).
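To illustrate that point: object names are flat strings, and the Cloud Console merely renders the '/' characters as folders. A minimal sketch, assuming the bucket name and key file from the corrected script below, that lists what appears "inside" test/ by filtering on the name prefix:

    from google.cloud import storage

    # Assumed names, matching the corrected script below.
    client = storage.Client.from_service_account_json('./test.json')
    bucket = client.get_bucket('bucket-test-18698335')

    # There is no real 'test' folder; we simply ask for objects whose names
    # start with 'test/'. delimiter='/' groups deeper prefixes the way the
    # Cloud Console renders sub-folders.
    blobs = bucket.list_blobs(prefix='test/', delimiter='/')
    for blob in blobs:
        print(blob.name)
    print('Sub-folders:', list(blobs.prefixes))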

    from google.cloud import storage

    bucket_name = 'bucket-test-18698335'
    source_file_name = './hello.txt'
    destination_blob_name = 'test/hello.txt'

    def upload_blob(bucket_name, source_file_name, destination_blob_name):
        """Uploads a file to the bucket."""
        client = storage.Client.from_service_account_json('./test.json')
        bucket = client.get_bucket(bucket_name)
        blob = bucket.blob(destination_blob_name)
        blob.upload_from_filename(source_file_name)
        print('File {} uploaded to {}.'.format(
            source_file_name, destination_blob_name))

    if __name__ == '__main__':
        upload_blob(bucket_name, source_file_name, destination_blob_name)
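If you also want to upload the second file from your original script, give each local file its own destination object name instead of reusing 'test/'. A sketch under the same assumptions (the bucket and key file above, the source paths from your question):

    import os
    from google.cloud import storage

    # Same assumed bucket and service-account key as the corrected script above.
    client = storage.Client.from_service_account_json('./test.json')
    bucket = client.get_bucket('bucket-test-18698335')

    # Each local file gets its own object name under the 'test/' prefix.
    for source in ['D:/file.txt', 'D:/jenkins structure.png']:
        destination = 'test/' + os.path.basename(source)
        bucket.blob(destination).upload_from_filename(source)
        print('File {} uploaded to {}.'.format(source, destination))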
