gcloud auth login issue: 403 Access Denied when trying to run BigQuery

EDIT: this seems to be an issue only for tables based on Google Drive files. I am using auth login, and the jobs show up in the BigQuery UI as being run by me as the owner, but they return this error even though I should have full permissions for this data and for the files it is based on.

I am having an issue getting gcloud to work with BigQuery; my script returns the following error:

google.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.

I have tried all the various login commands:

gcloud auth application-default login

gcloud auth login

I then get taken to the browser and grant the access permissions; everything seems fine, but I still get the same error.
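One way to sanity-check what the script actually authenticates as is to inspect what Application Default Credentials resolve to. This is just a diagnostic sketch using the google-auth library (which the BigQuery client uses under the hood); the attribute names may vary by library version:

    import google.auth

    # Resolve Application Default Credentials the same way bigquery.Client does.
    credentials, project = google.auth.default()

    print(type(credentials))  # e.g. google.oauth2.credentials.Credentials for user login
    print(project)            # the project the credentials are bound to
    # User credentials expose the scopes they were granted (may be None).
    print(getattr(credentials, "scopes", None))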

When I run the following, my account shows up fine, and I can see my login credentials as a JSON file, but I cannot understand why it says I do not have access.

    gcloud auth list

In the BigQuery UI I can see the query being requested with myself as the owner, and I can run the same query successfully in the BigQuery console, but it just does not work when requested via the script.
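To rule out Streamlit and the SQL file, a stripped-down reproduction can isolate the failure to the Python credential path. This is a hypothetical sketch; the "my-data" project name is taken from the script below:

    from google.cloud import bigquery

    # Hypothetical minimal reproduction, outside Streamlit.
    client = bigquery.Client(project="my-data")

    # A trivial query should succeed with plain BigQuery scopes...
    print(list(client.query("SELECT 1").result()))

    # ...while a query against a Drive-backed table is what raises the 403,
    # since it additionally needs Drive credentials.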

Here is the code I am running. Multiple people have run this code, which confirms it works, and they have not done anything differently from me.

I assume I have done something wrong with the gcloud login, but I am struggling to understand what.

    import streamlit as st
    import pandas as pd
    from datetime import datetime as dt
    from google.cloud import bigquery


    def load_data(query_file):
        with open(query_file, "r+") as f:
            query = f.read()
        client = bigquery.Client(project="my-data")
        df = client.query(query).result().to_dataframe()
        return df


    def load_current_es(ttl=60 * 60 * 540):
        """Get current ES names and pod

        Args:
            None

        Returns:
            pd.DataFrame
        """
        columns = {"full_name": "str"}
        df = load_data("data/current_es.sql")
        assert len(df) > 0, "Dataframe for current ES is empty"
        # correct data types
        df = df.astype(dtype=columns)
        return df


    df = load_current_es()
    print(df.head())

So in the end this turned out to be a gcloud authentication issue.

I had run gcloud auth login and gcloud auth application-default login --scopes https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/bigquery and thought this had granted Drive access, but I was still receiving the 403 Drive access denied error on tables linked to Drive files. It was strange, as the BigQuery UI showed the jobs being called with me as the owner, but the permissions did not seem to be applied as expected.
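For anyone who prefers to handle this in code instead, the programmatic equivalent is to request the Drive scope explicitly when building credentials and hand them to the client. This is a sketch of the pattern documented for querying Drive-backed tables, not the fix I ultimately used; the "my-data" project name is from my script:

    import google.auth
    from google.cloud import bigquery

    # Request ADC with the Drive scope in addition to the BigQuery scope,
    # so the client can fetch Drive credentials for Drive-backed tables.
    credentials, project = google.auth.default(
        scopes=[
            "https://www.googleapis.com/auth/drive",
            "https://www.googleapis.com/auth/bigquery",
        ]
    )
    client = bigquery.Client(credentials=credentials, project="my-data")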

The solution was to also run gcloud auth login --enable-gdrive-access, after which it gives access to all tables.
