
Unable to import BigQuery data into GCP AI Notebook

I had previously used the code below to import data from BigQuery into an AI Platform Notebooks instance in GCP. For unknown reasons it stopped working, and it now gives me the following error: "ImportError: cannot import name 'bigquery_storage_v1beta1' from 'google.cloud' (unknown location)". It may have started after I began cloning my GitHub repository into the instance (I delete the notebook instance between sessions to avoid charges), but I can't be sure. Thoughts?

import pandas as pd
import numpy as np

from google.cloud import bigquery
from google.cloud import storage

pd.options.display.max_rows = 200

query = """
SELECT *
FROM
  [table_name] """

df = bigquery.Client().query(query).to_dataframe()
df.head()
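For context, the likely cause (an assumption based on the packages' release history) is a version mismatch: the `bigquery_storage_v1beta1` module was removed in `google-cloud-bigquery-storage` 2.0, while older releases of `google-cloud-bigquery` still try to import it when converting results to a DataFrame. A small diagnostic sketch to check what is installed:

```python
# Diagnostic sketch. Assumption: the v1beta1 storage client was dropped in
# google-cloud-bigquery-storage 2.0, so an older google-cloud-bigquery that
# still imports it will raise exactly this ImportError.
from importlib import metadata


def v1beta1_removed(version: str) -> bool:
    """Return True if this bigquery-storage version no longer ships v1beta1."""
    major = int(version.split(".")[0])
    return major >= 2


def check_installed() -> None:
    """Print the installed versions of the two packages involved."""
    for pkg in ("google-cloud-bigquery", "google-cloud-bigquery-storage"):
        try:
            print(pkg, metadata.version(pkg))
        except metadata.PackageNotFoundError:
            print(pkg, "not installed")


if __name__ == "__main__":
    check_installed()
```

If the storage package reports a 2.x version alongside an old `google-cloud-bigquery`, upgrading both together (as in the accepted fix below the question) resolves the mismatch.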

The solution was to update the packages with the following commands, run from a notebook cell:

First:

!sudo /opt/conda/bin/conda install -c conda-forge google-cloud-bigquery google-cloud-bigquery-storage pandas pyarrow --yes

Then:

%pip install google-cloud-bigquery
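If upgrading is not immediately possible, there is also a code-side workaround worth knowing: `to_dataframe()` accepts a `create_bqstorage_client` flag (available in `google-cloud-bigquery` 1.24+, to my knowledge), and passing `False` forces the plain REST download path so the BigQuery Storage client is never imported. A minimal sketch (the function name is mine; running it for real requires GCP credentials):

```python
# Workaround sketch: skip the BigQuery Storage API entirely so the broken
# bigquery_storage import is never triggered. Slower for large results.
def fetch_dataframe(query: str):
    """Run `query` and download the results over plain REST."""
    # Imported lazily: this needs google-cloud-bigquery installed and
    # application default credentials configured.
    from google.cloud import bigquery

    client = bigquery.Client()
    job = client.query(query)
    # create_bqstorage_client=False avoids the storage-client code path.
    return job.to_dataframe(create_bqstorage_client=False)
```

This trades download speed for robustness; the package upgrade above remains the proper fix.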
