
Google BigQuery Payload size limit of 10485760 bytes

We encountered an error while trying to stream data into a BigQuery table: "payload size limit of 10485760 bytes". Does anyone know what causes this? The third-party integration vendor we use to move data from SQL Server to the BigQuery table advised that this is a limit imposed by BigQuery — is that correct?

Thanks. Best regards,

BigQuery has some maximum limitations and also some quota policies, as you can see here.

The limitations for streaming are:

  • If you do not populate the insertId field when you insert rows:

    • Maximum rows per second: 1,000,000
    • Maximum bytes per second: 1 GB
  • If you populate the insertId field when you insert rows:

    • Maximum rows per second: 100,000
    • Maximum bytes per second: 100 MB
  • The following additional streaming quotas apply whether or not you populate the insertId field:

    • Maximum row size: 1 MB
    • HTTP request size limit: 10 MB
    • Maximum rows per request: 10,000 rows per request
    • insertId field length: 128
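Given those per-request limits, the usual fix is to split the rows client-side before sending them. Below is a minimal sketch of such a batching helper; `batch_rows` is a hypothetical name, and it estimates each row's size from its JSON serialization (the real HTTP request adds envelope overhead, so a safety margin below 10 MB is advisable).

```python
import json

# Limits from the quota list above.
MAX_REQUEST_BYTES = 10 * 1024 * 1024   # 10 MB HTTP request size limit
MAX_ROWS_PER_REQUEST = 10_000          # 10,000 rows per request

def batch_rows(rows, max_bytes=MAX_REQUEST_BYTES, max_rows=MAX_ROWS_PER_REQUEST):
    """Split rows into batches that stay under the per-request limits.

    Row size is estimated from the row's JSON serialization; a production
    pipeline should leave headroom for the request envelope.
    """
    batches, current, current_bytes = [], [], 0
    for row in rows:
        row_bytes = len(json.dumps(row).encode("utf-8"))
        # Start a new batch if adding this row would break either limit.
        if current and (current_bytes + row_bytes > max_bytes
                        or len(current) >= max_rows):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(row)
        current_bytes += row_bytes
    if current:
        batches.append(current)
    return batches

rows = [{"id": i, "payload": "x" * 100} for i in range(25_000)]
batches = batch_rows(rows)
print(len(batches))  # 3 — capped at 10,000 rows per request
```

Each resulting batch can then be passed to a single streaming-insert call (e.g. one `tabledata.insertAll` request), so no individual request exceeds the 10485760-byte limit.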

I hope it helps.

Indeed, the streaming limit is 10 MB per request.

Row size is 1 MB according to https://cloud.google.com/bigquery/quotas.

What you need to do is parallelize the streaming jobs. BigQuery supports up to 1M rows per second.
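The parallelization can be sketched with a thread pool that sends independent batches concurrently. `send_batch` below is a hypothetical stand-in: in a real pipeline it would issue one streaming-insert request (e.g. `tabledata.insertAll`) per batch; here it just reports the row count so the sketch is self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def send_batch(batch):
    # Placeholder for one streaming-insert request; a real implementation
    # would call the BigQuery API here and handle per-row insert errors.
    return len(batch)

def stream_in_parallel(batches, workers=4):
    """Send batches concurrently. Each request individually stays under the
    10 MB / 10,000-row per-request limits; parallelism raises aggregate
    throughput toward the rows-per-second quota."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(send_batch, batches))

batches = [[{"id": i} for i in range(1_000)] for _ in range(8)]
print(sum(stream_in_parallel(batches)))  # 8000 rows sent across 8 requests
```

Because the per-request size limit is fixed, throughput scales with the number of concurrent requests, subject to the rows-per-second and bytes-per-second quotas listed above.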
