
Unable to load data from CoLab to BigQuery

I am trying to load 5 SQL files into 5 different tables in BigQuery so I can visualize the data in Data Studio. I have uploaded these files to Colab's storage section and authorized the project.

datasets = [r"/file1.sql", r"/file2.sql", r"/file3.sql", r"/file4.sql", r"/file5.sql"]
f = open(datasets[1], "r")
data=f.read()
data = data.replace('\n','')
import pandas as pd

df = pd.io.gbq.read_gbq('''data''', project_id='newproject1', dialect='standard')

df.head()

df.to_gbq('dataset1.testtable1','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable2','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable3','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable4','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable5','newproject1',chunksize=None,reauth=False,if_exists='append')


I get the below error when I run the query.

The following traceback may be corrupted or invalid
The error message is: ('EOF in multi-line string', (1, 0))

---------------------------------------------------------------------------
BadRequest                                Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/pandas_gbq/gbq.py in _download_results(self, query_job, max_results, progress_bar_type)
    549 
--> 550             query_job.result()
    551             # Get the table schema, so that we can list rows.

Also, I want to know how to change the Python code to load the data into the 5 respective tables.

I don't know if it is a typo or the actual issue, but try replacing:

df.to_gbq(dataset1.testtable1','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq(dataset1.testtable2','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq(dataset1.testtable3','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq(dataset1.testtable4','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq(dataset1.testtable5','newproject1',chunksize=None,reauth=False,if_exists='append')

with

df.to_gbq('dataset1.testtable1','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable2','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable3','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable4','newproject1',chunksize=None,reauth=False,if_exists='append')
df.to_gbq('dataset1.testtable5','newproject1',chunksize=None,reauth=False,if_exists='append')
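For the second part of the question (loading each file into its own table), one option is to loop over file/table pairs instead of repeating `to_gbq` five times. Note also that `read_gbq('''data''', ...)` in the original code sends the literal three-character string `data` as the query rather than the contents of the variable, which is consistent with the `BadRequest` traceback; the variable itself must be passed. A minimal sketch, assuming the file and table names from the question and a one-to-one pairing between them:

```python
import pandas as pd

def load_sql_files_to_tables(paths, tables, project_id='newproject1'):
    """Run each SQL file's query and append the result to its table."""
    for path, table in zip(paths, tables):
        with open(path) as f:
            query = f.read()  # pass the actual SQL text, not the string 'data'
        df = pd.io.gbq.read_gbq(query, project_id=project_id, dialect='standard')
        df.to_gbq(table, project_id, chunksize=None,
                  reauth=False, if_exists='append')

datasets = ["/file1.sql", "/file2.sql", "/file3.sql", "/file4.sql", "/file5.sql"]
tables = [f"dataset1.testtable{i}" for i in range(1, 6)]
# load_sql_files_to_tables(datasets, tables)  # requires BigQuery auth in Colab
```

The call is commented out because it needs the files on disk and an authorized Colab session; uncomment it once both are in place.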
