Cloud Storage: how to set up service account credentials for the Python boto library

I'm following this tutorial to upload a file to a bucket I've manually created: https://cloud.google.com/storage/docs/xml-api/gspythonlibrary

I seem to be having trouble setting up the credentials, either as a service account or as a user account. I want to use this on a web server, so ideally it should be set up with a service account.

I created credentials using the API Manager in the Console and downloaded the JSON key file. Meanwhile, my gcloud auth is set up with my OAuth login. I did try gsutil config -e and got this error:

CommandException: OAuth2 is the preferred authentication mechanism with the Cloud SDK. Run "gcloud auth login" to configure authentication, unless you want to authenticate with an HMAC access key and secret, in which case run "gsutil config -a".

I also tried to authenticate the service account using: gcloud auth activate-service-account --key-file <json file>

but still no luck enabling access with Python boto. I also copied the client ID and secret from ~/.config/gcloud/ to ~/.boto, but that didn't work either. I'm not sure how I'm supposed to set up authentication so that the Python server can access Cloud Storage. I'm not using App Engine; the web server is set up on Compute Engine.
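
From what I can tell, the service-account entries that should end up in ~/.boto look roughly like this (the e-mail address, key file path, and project ID below are placeholders, not my real values):

[Credentials]
gs_service_client_id = my-service-account@my-test-project-id.iam.gserviceaccount.com
gs_service_key_file = /path/to/service-account-key.json

[GSUtil]
default_project_id = my-test-project-id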

Here's my source code:

import boto
import gcs_oauth2_boto_plugin
import os
import shutil
import StringIO
import tempfile
import time

CLIENT_ID = 'my client id from ~/.config/gcloud/credentials'
CLIENT_SECRET = 'my client secret from ~/.config/gcloud/credentials'
gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)

uri = boto.storage_uri('', 'gs')
project_id = 'my-test-project-id'
header_values = {"x-test-project-id": project_id}
# If the default project is defined, call get_all_buckets() without arguments.
for bucket in uri.get_all_buckets(headers=header_values):
    print bucket.name

Most recent error:

Traceback (most recent call last):
  File "upload/uploader.py", line 14, in <module>
    for bucket in uri.get_all_buckets(headers=header_values):
  File "/Users/ankitjain/dev/metax/venv/lib/python2.7/site-packages/boto/storage_uri.py", line 574, in get_all_buckets
    return conn.get_all_buckets(headers)
  File "/Users/ankitjain/dev/metax/venv/lib/python2.7/site-packages/boto/s3/connection.py", line 444, in get_all_buckets
    response.status, response.reason, body)
boto.exception.GSResponseError: GSResponseError: 403 Forbidden
<?xml version='1.0' encoding='UTF-8'?><Error><Code>InvalidSecurity</Code><Message>The provided security credentials are not valid.</Message><Details>Incorrect Authorization header</Details></Error>

Okay, after some more experiments I gave up on using the GCS boto plugin for uploading data to Cloud Storage. I actually found that using boto's plain AWS S3 API to upload to Cloud Storage works much better. All I had to do was point the S3 connection at Google's host:

conn = boto.connect_s3(app.config['S3_KEY'], app.config['S3_SECRET'], "c.storage.googleapis.com")
bucket = conn.get_bucket(app.config['S3_BUCKET'], validate=False)
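
From there, uploading a file is the standard boto S3 pattern. A minimal sketch (the object name and local file path are placeholders):

from boto.s3.key import Key

key = Key(bucket)
key.key = 'uploads/myfile.txt'  # destination object name in the bucket (placeholder)
key.set_contents_from_filename('/path/to/local/file.txt')  # local file to upload (placeholder)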

I used HMAC credentials generated as described in the Google docs. Reference: https://cloud.google.com/storage/docs/migrating
