
Can there be data loss when streaming into BigQuery?

I am currently streaming data into a table in BigQuery, and I can see that data from the last two days is missing, even though the logs show HTTP 200 responses for those requests. Is it possible for data to end up lost in the BQ streaming buffer?

Are you inspecting the full response for tabledata.insertAll beyond the HTTP response code? Streaming may return HTTP 200 responses but include additional information about specific rows with issues in the request batch: https://cloud.google.com/bigquery/troubleshooting-errors#streaming-success
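To illustrate, here is a minimal sketch of checking the per-row errors that can accompany an otherwise successful insert, assuming the google-cloud-bigquery Python client (which wraps tabledata.insertAll). The table ID and row payload are hypothetical placeholders.

```python
# Sketch: detect per-row streaming failures despite an HTTP 200 response.
# Assumes the google-cloud-bigquery client library; table and rows are
# hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.my_table"  # hypothetical table

rows = [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"},
]

# insert_rows_json wraps tabledata.insertAll. It returns a list of
# per-row error mappings; the list is empty only if every row succeeded.
errors = client.insert_rows_json(table_id, rows)
if errors:
    # The HTTP call itself returned 200, but individual rows were
    # rejected (e.g. schema mismatches). Log them rather than assuming
    # the whole batch landed.
    for entry in errors:
        print(f"row {entry['index']} failed: {entry['errors']}")
else:
    print("all rows inserted")
```

Rows rejected this way never reach the table, which would explain missing data alongside clean HTTP 200 logs.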
