
Access BigQuery data from a Jupyter Notebook in AI Platform on Google Cloud

I am trying to access data stored in BigQuery from a Jupyter Notebook in AI Platform on Google Cloud Platform. First, I tried the following code:

from google.cloud import bigquery
from google.oauth2 import service_account
credentials = service_account.Credentials.from_service_account_file(r'\local_path\gcpcred.json')

project_id = 'my-bq'
client = bigquery.Client(credentials=credentials, project=project_id)

The authentication credentials are stored in a JSON file named gcpcred.json on the local machine, but this gives me an error saying

FileNotFoundError: [Errno 2] No such file or directory: '\local_path\gcpcred.json

I thought that since I am running this in AI Platform (on the cloud itself), I would not have to use this API and authenticate.

So I simply wrote:

%%bigquery 
SELECT * FROM  `project.dataset.table`  LIMIT 1000

I got an error saying

ERROR: 403 Access Denied: User does not have access to the table

How do I access the table? Please help.

It seems the service account associated with the Jupyter notebook doesn't have sufficient privileges to access BigQuery. You can grant the required roles to that service account in the IAM & Admin section of the console. The links below provide further clarification:
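As a sketch, assuming the notebook VM runs as a service account in the same project, the typical read-and-query roles could be granted with gcloud (the service-account email here is a placeholder; use the one shown on your notebook instance):

```shell
# Allow the notebook's service account to read BigQuery table data
gcloud projects add-iam-policy-binding my-bq \
    --member="serviceAccount:notebook-sa@my-bq.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"

# Allow it to run query jobs
gcloud projects add-iam-policy-binding my-bq \
    --member="serviceAccount:notebook-sa@my-bq.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"
```

`roles/bigquery.dataViewer` covers reading tables, while `roles/bigquery.jobUser` is what lets the account actually execute queries; a 403 like the one above usually means one of the two is missing.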

Visualizing BigQuery data in a Jupyter notebook

Getting started with authentication
