
Google Cloud Storage access via service account

I've been repeatedly hitting my head against the proverbial brick wall of GCP's Storage API.

I'm trying to use the django-storages package to connect to a GCS bucket for my static files, and for anything else I might want to store there in the future.

According to the django-storages documentation ( https://django-storages.readthedocs.io/en/latest/backends/gcloud.html#usage ), if you are running inside GCP, you grant your service account Storage permissions via the IAM interface and everything should be tickety-boo.

So, my GCP Cloud Build runner builds the Docker images, then runs python manage.py migrate and python manage.py collectstatic before deploying my Docker image to Cloud Run. The build runner uses a service account called XXXX@cloudbuild.gserviceaccount.com , so going into IAM, I add the "Cloud Storage – Storage Admin" role and, just to be sure, the "Cloud Storage – Storage Object Admin" role as well.
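The pipeline described above might look roughly like this as a cloudbuild.yaml; I didn't include mine in the post, so image names, step ids, and the region here are placeholders, not my actual config:

```yaml
# cloudbuild.yaml -- sketch of the described pipeline; names are placeholders.
steps:
  - id: "build image"
    name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "gcr.io/$PROJECT_ID/myapp", "."]
  - id: "apply migrations"            # the step that fails below
    name: "gcr.io/$PROJECT_ID/myapp"
    entrypoint: "python"
    args: ["manage.py", "migrate"]
  - id: "collect static"
    name: "gcr.io/$PROJECT_ID/myapp"
    entrypoint: "python"
    args: ["manage.py", "collectstatic", "--noinput"]
  - id: "deploy"
    name: "gcr.io/cloud-builders/gcloud"
    args: ["run", "deploy", "myapp",
           "--image", "gcr.io/$PROJECT_ID/myapp",
           "--region", "us-central1"]
images:
  - "gcr.io/$PROJECT_ID/myapp"
```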

Now I trigger a re-run of my Cloud Build and... at the migrate stage I receive the error:

...
Step #2 - "apply migrations":   File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
Step #2 - "apply migrations":     return _bootstrap._gcd_import(name[level:], package, level)
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap>", line 991, in _find_and_load
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap_external>", line 843, in exec_module
Step #2 - "apply migrations":   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
Step #2 - "apply migrations":   File "/src/lang/urls.py", line 20, in <module>
Step #2 - "apply migrations":     re_path('favicon.ico$', RedirectView.as_view(url=staticfiles_storage.url('images/apple_touch_icon.png'), permanent=False)),
Step #2 - "apply migrations":   File "/usr/local/lib/python3.8/site-packages/storages/backends/gcloud.py", line 290, in url
Step #2 - "apply migrations":     return blob.generate_signed_url(
Step #2 - "apply migrations":   File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/blob.py", line 620, in generate_signed_url
Step #2 - "apply migrations":     return helper(
Step #2 - "apply migrations":   File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py", line 550, in generate_signed_url_v4
Step #2 - "apply migrations":     ensure_signed_credentials(credentials)
Step #2 - "apply migrations":   File "/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py", line 52, in ensure_signed_credentials
Step #2 - "apply migrations":     raise AttributeError(
Step #2 - "apply migrations": AttributeError: you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
Finished Step #2 - "apply migrations"

Huh. I can't seem to authenticate via the service account.

Following Google's example Django tutorial, I have this line in my settings.py:

credentials, project_id = google.auth.default()

But I don't do anything with the returned credentials variable. The documentation seems a little sparse on how to access buckets via service accounts. Any insights?
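To see why the traceback ends where it does: google-cloud-storage's signing helper checks whether the credentials object is able to sign bytes, and the Compute Engine credentials that Cloud Build hands you carry only an OAuth token, no private key. Here is a minimal self-contained sketch of that distinction; the class names and the check are illustrative, mirroring the spirit of `google/cloud/storage/_signing.py` rather than its exact code:

```python
class ComputeEngineCredentials:
    """Token-only credentials, like what google.auth.default() returns
    on Cloud Build / Cloud Run: can authenticate API calls, cannot sign."""
    token = "ya29.placeholder"


class ServiceAccountCredentials(ComputeEngineCredentials):
    """Key-file credentials: carry a private key, so they can also sign."""

    def sign_bytes(self, payload: bytes) -> bytes:
        # The real implementation signs the payload with the private key.
        return b"signature"


def ensure_signed_credentials(credentials):
    """Sketch of the check that raises the AttributeError in the traceback."""
    if not hasattr(credentials, "sign_bytes"):
        raise AttributeError("you need a private key to sign credentials")


ensure_signed_credentials(ServiceAccountCredentials())  # fine: has a key
try:
    # generate_signed_url() effectively does this with token-only credentials
    ensure_signed_credentials(ComputeEngineCredentials())
except AttributeError as exc:
    print(f"signing failed: {exc}")
```

So the IAM roles were never the problem; no amount of Storage Admin permissions gives token-only credentials the ability to produce signed URLs.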

I found a user with a similar issue: https://pnote.eu/notes/django-app-engine-user-uploaded-files/

It turns out the problem occurs for buckets whose access policy is Uniform rather than fine-grained. The author of the article above lodged an issue with django-storages, and a fix was eventually merged. There is now a "Note" box in the documentation, which I had missed, stating:

GS_DEFAULT_ACL: When using this setting, make sure you have fine-grained access control enabled on your bucket, as opposed to Uniform access control, or else, file uploads will return with HTTP 400. If you already have a bucket with Uniform access control set to public read, please keep GS_DEFAULT_ACL to None and set GS_QUERYSTRING_AUTH to False.

So in short, the solution is to add to your settings.py file:

GS_QUERYSTRING_AUTH = False
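Per the documentation note above, the full settings.py fragment for a bucket with Uniform access control looks like this; with no per-object ACLs available and no private key to sign URLs, both behaviours are switched off:

```python
# settings.py -- fix for buckets using Uniform bucket-level access.
GS_DEFAULT_ACL = None        # don't attempt per-object ACLs on upload
GS_QUERYSTRING_AUTH = False  # url() returns a plain URL, never a signed one
```

With GS_QUERYSTRING_AUTH set to False, django-storages stops calling generate_signed_url(), so the "you need a private key to sign credentials" error above never fires; the bucket's own policy (e.g. public read) then governs access to the files.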
