

BigQuery error in load operation: Too many total leaf fields

I am uploading a big file into BigQuery, because it is too slow to operate on my own PC:

bq --location=EU load --field_delimiter='\t' --skip_leading_rows=1 --source_format=CSV single_cells.retinal_bipolar gs://single_cells/retinal-bipolar-neuron-drop-seq/exp_matrix.txt ./schema.json

However, I got an error:

BigQuery error in load operation: Too many total leaf fields: 27500

Indeed, the data has 27,500 columns; doesn't BigQuery allow this?

The maximum number of columns per table is 10,000, so the error was thrown because the load job exceeded the maximum number of fields allowed in the schema.

Based on this, one available workaround is to split the table into smaller tables in order to reduce the number of fields each one contains; that way you can avoid this issue.
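The split can be done on the file before uploading. A minimal sketch in Python (the output file naming and the choice to repeat the first ID column in every chunk, so the resulting tables can be joined back together, are assumptions):

```python
import csv

# BigQuery allows at most 10,000 columns per table, so a 27,500-column
# matrix must be split into at least three narrower files.
MAX_COLS = 10_000
ID_COL = 0  # repeat the first column (row ID) in every chunk for later joins

def split_wide_tsv(path, out_prefix, max_cols=MAX_COLS):
    """Split one wide TSV into several files of at most max_cols columns each."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f, delimiter="\t"))
    header, data = rows[0], rows[1:]
    out_paths = []
    step = max_cols - 1  # reserve one slot per chunk for the repeated ID column
    for i, start in enumerate(range(1, len(header), step)):
        cols = [ID_COL] + list(range(start, min(start + step, len(header))))
        out_path = f"{out_prefix}_{i}.tsv"
        with open(out_path, "w", newline="") as out:
            writer = csv.writer(out, delimiter="\t")
            for row in [header] + data:
                writer.writerow([row[c] for c in cols])
        out_paths.append(out_path)
    return out_paths
```

Each chunk file can then be loaded into its own table with a separate bq load invocation, and the tables joined on the ID column when needed.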

Another workaround, as Elliott pointed out: you can load your file into a BigQuery table whose schema has only ONE column, of type STRING. Then (assuming the file and row sizes do not exceed the size limitations) you will be able to use the full power of BigQuery!
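With the single-STRING-column approach, each loaded row holds one whole tab-delimited line, which you can then parse inside BigQuery with SPLIT, e.g. SELECT SPLIT(raw, '\t')[OFFSET(0)] FROM your_table (raw and your_table are placeholder names). The transformation SPLIT performs, sketched in plain Python with made-up data:

```python
def parse_raw_row(raw: str, offsets):
    """Mimic SPLIT(raw, '\t')[OFFSET(i)] for each i in offsets."""
    fields = raw.split("\t")
    return [fields[i] for i in offsets]

# One loaded STRING value (made-up example data):
row = "GENE1\t0.0\t1.5\t2.3"
print(parse_raw_row(row, [0, 2]))  # pick column 0 (the ID) and column 2
# → ['GENE1', '1.5']
```

This keeps the load job within the column limit, at the cost of doing the column extraction at query time.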
