

BigQuery error in load operation: URI not found

I have, in the same GCP project, a BigQuery dataset and a Cloud Storage bucket, both in the us-central1 region. The storage bucket contains a single Parquet file. When I run the command below:

bq load \
--project_id=myProject --location=us-central1 \
--source_format=PARQUET \
myDataSet:tableName \
gs://my-storage-bucket/my_parquet.parquet

It fails with the error below:

BigQuery error in load operation: Error processing job '[job_no]': Not found: URI gs://my-storage-bucket/my_parquet.parquet

Removing the --project_id or --location flags doesn't affect the outcome.

Figured it out: the documentation is incorrect. I actually had to declare the source as gs://my-storage-bucket/my_parquet.parquet/part* and it loaded fine.
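For reference, the working invocation with the wildcard source would look something like this. This is a sketch based on the question's bucket and identifier names; it assumes the gs:// path is actually a directory of part-* files (as written by Spark and similar tools, which is why the single-object URI is "not found"), and it requires an authenticated bq/gcloud setup to run:

```shell
# Sketch: load all part files under the "my_parquet.parquet" prefix.
# Bucket, dataset, and table names are taken from the question above;
# running this requires bq to be authenticated against the right project.
bq load \
  --project_id=myProject --location=us-central1 \
  --source_format=PARQUET \
  myDataSet.tableName \
  "gs://my-storage-bucket/my_parquet.parquet/part*"
```

Quoting the URI keeps the local shell from trying to expand the `*` itself; bq passes the wildcard to BigQuery, which matches objects under that prefix.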

There were some internal issues with BigQuery on 3rd March, and they have now been fixed.

I have confirmed this and used the following command to successfully load a Parquet file from Cloud Storage into a BigQuery table with the bq command:

bq load --project_id=PROJECT_ID \
--source_format=PARQUET \
DATASET.TABLE_NAME gs://BUCKET/FILE.parquet

Please note that, according to the BigQuery official documentation, you have to declare the table name as DATASET.TABLE_NAME (in the post, I can see a colon : used instead of a dot .).
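To illustrate the separator difference this answer points out, the question's table reference can be rewritten from the colon form to the documented dot form; a minimal shell sketch, using the question's identifier names:

```shell
# The question passed myDataSet:tableName; the documented form is
# myDataSet.tableName. Rewrite the first ':' to '.' with parameter expansion.
TABLE_REF="myDataSet:tableName"
FIXED_REF="${TABLE_REF/:/.}"
echo "$FIXED_REF"   # prints myDataSet.tableName
```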


