
Google BigQuery Payload size limit of 10485760 bytes

We encountered an error while trying to stream data into a BigQuery table: "payload size limit of 10485760 bytes". Does anyone have any idea what causes this? According to the third-party integration vendor we use to move data from SQL Server to BigQuery, this is a limit imposed by BigQuery itself.

Thanks. Best regards,

BigQuery has some maximum limitations and also some quota policies, as you can see in the quotas documentation (https://cloud.google.com/bigquery/quotas).

The limitations for Streaming are:

  • If you do not populate the insertId field when you insert rows:

    • Maximum rows per second: 1,000,000
    • Maximum bytes per second: 1 GB
  • If you populate the insertId field when you insert rows:

    • Maximum rows per second: 100,000
    • Maximum bytes per second: 100 MB
  • The following additional streaming quotas apply whether or not you populate the insertId field:

    • Maximum row size: 1 MB
    • HTTP request size limit: 10 MB
    • Maximum rows per request: 10,000 rows per request
    • Maximum insertId field length: 128 characters
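The 10485760-byte figure in your error is exactly this 10 MB HTTP request size limit. As a rough sketch (the helper name and request-body shape are our illustration of the tabledata.insertAll payload, not an official API), you can estimate a request's size before sending it:

```python
import json

MAX_REQUEST_BYTES = 10 * 1024 * 1024  # 10485760 bytes: HTTP request size limit

def estimated_request_size(rows):
    """Approximate the JSON payload size of a streaming insert request body."""
    body = {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [{"json": row} for row in rows],
    }
    return len(json.dumps(body).encode("utf-8"))

rows = [{"id": i, "name": "x" * 50} for i in range(100)]
print(estimated_request_size(rows) <= MAX_REQUEST_BYTES)
```

This is only an estimate (the real request adds headers and encoding overhead), but it is enough to catch batches that would trip the 10 MB limit before they leave your process.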

I hope it helps.

Indeed the streaming limit is 10MB per request.

Row size is 1MB according to https://cloud.google.com/bigquery/quotas

What you need to do is parallelize the streaming inserts, keeping each individual request under the limits. BigQuery supports up to 1M rows per second (when you do not populate insertId).
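One way to do that (a sketch under our own assumptions, not the vendor's implementation) is to split the rows into batches that respect both the 10,000-rows-per-request and 10 MB limits, then stream each batch as a separate request, possibly from several workers:

```python
import json

MAX_REQUEST_BYTES = 10 * 1024 * 1024  # 10 MB HTTP request size limit
MAX_ROWS_PER_REQUEST = 10_000         # maximum rows per streaming request

def chunk_rows(rows, max_bytes=MAX_REQUEST_BYTES, max_rows=MAX_ROWS_PER_REQUEST):
    """Yield batches of rows that stay under the size and row-count limits."""
    batch, batch_bytes = [], 0
    for row in rows:
        row_bytes = len(json.dumps(row).encode("utf-8"))
        # Start a new batch if adding this row would exceed either limit.
        if batch and (len(batch) >= max_rows or batch_bytes + row_bytes > max_bytes):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(row)
        batch_bytes += row_bytes
    if batch:
        yield batch
```

Each batch can then be sent with, for example, `client.insert_rows_json(table, batch)` from the google-cloud-bigquery client, and the batches can be distributed across threads or workers to approach the per-second throughput limits.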
