
Streaming JSON into BigQuery

From the Google Drive API I receive an array of structs of type File. My aim is to add a few fields and stream the data into BigQuery.

My first approach was to change the File struct and stream the updated structs to BigQuery. This looks like a dead end, so I am trying the suggested method: marshal the struct into JSON and stream that into BigQuery.

I found this example, bigquery-table-insert-rows, but it implements the ValueSaver interface. For me, a simple marshal and then streaming the JSON into BigQuery should be enough.

However, I can't find any method or example that does that. So I would like to know whether it is possible to stream JSON into BigQuery using Go. A basic example would be great.
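For context, this is roughly the shape I'm after; the wrapper struct, the extra fields and their names below are just placeholders:

import (
    "encoding/json"
    "time"

    "google.golang.org/api/drive/v3"
)

// FileRecord is the row I want to end up with: a few Drive fields plus my own.
type FileRecord struct {
    ID        string    `json:"id"`
    Name      string    `json:"name"`
    MimeType  string    `json:"mimeType"`
    Owner     string    `json:"owner"`     // extra field (placeholder)
    FetchedAt time.Time `json:"fetchedAt"` // extra field (placeholder)
}

// toJSON builds the JSON I would like to stream into BigQuery.
func toJSON(f *drive.File) ([]byte, error) {
    return json.Marshal(FileRecord{
        ID:        f.Id,
        Name:      f.Name,
        MimeType:  f.MimeType,
        Owner:     "me",
        FetchedAt: time.Now(),
    })
}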

You are on the right track in thinking that just providing the structs is enough.

Maybe looking at some simple but complete code will be of some help: https://github.com/tovare/idporten

All I do is Put a slice of structs, where the struct fields are annotated with bigquery tags.

type Metric struct {
    Timestamp time.Time `bigquery:"timestamp"`
    Metode    string    `bigquery:"metode"`
    Antall    int       `bigquery:"antall"`
}

....

seriesTableRef := client.Dataset(datasetName).Table(tableName)
if err := seriesTableRef.Inserter().Put(ctx, metrics); err != nil {
    return err
}
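In case a fuller picture helps, here is a minimal, self-contained sketch along the same lines (the project, dataset and table names are placeholders):

package main

import (
    "context"
    "log"
    "time"

    "cloud.google.com/go/bigquery"
)

// Metric is the same bigquery-annotated struct as above.
type Metric struct {
    Timestamp time.Time `bigquery:"timestamp"`
    Metode    string    `bigquery:"metode"`
    Antall    int       `bigquery:"antall"`
}

func main() {
    ctx := context.Background()

    // "my-project", "my_dataset" and "my_table" are placeholders.
    client, err := bigquery.NewClient(ctx, "my-project")
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    metrics := []Metric{
        {Timestamp: time.Now(), Metode: "GET", Antall: 42},
    }

    // The inserter accepts the slice of annotated structs directly;
    // no JSON marshalling is needed.
    inserter := client.Dataset("my_dataset").Table("my_table").Inserter()
    if err := inserter.Put(ctx, metrics); err != nil {
        log.Fatal(err)
    }
}

Put uses BigQuery's streaming-insert API under the hood, so the rows show up in the table shortly after the call returns.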
