
Is there any way to load Bigtable data into BigQuery?

I want to load Bigtable data into BigQuery directly.

Until now, I have been exporting Bigtable data to a CSV file using Python and then loading that CSV file into BigQuery.

But I don't want a CSV file as an intermediate step between Bigtable and BigQuery. Is there a direct way?

To add to Mikhail's recommendation, I'd suggest creating a permanent table in BigQuery from the external table. You define the schema for the columns you want, query the rows you're interested in, and save the result. Once that data is saved into BigQuery, querying it won't have any impact on your Bigtable performance. If you want the latest data, you can create a new permanent table with the same query.
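As a minimal sketch of that approach with the BigQuery Java client library: run a query against the external table and materialize the result into a native destination table. The external table `my_dataset.bigtable_external`, the destination table, and the column names are all hypothetical.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableId;

public class SnapshotBigtableData {
  public static void main(String[] args) throws InterruptedException {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Query the permanent external table (hypothetical name) and save the
    // result as a native BigQuery table. Rerunning this with the same query
    // and a fresh destination picks up the latest Bigtable data.
    QueryJobConfiguration config = QueryJobConfiguration
        .newBuilder("SELECT rowkey, cf.value FROM my_dataset.bigtable_external")
        .setDestinationTable(TableId.of("my_dataset", "bigtable_snapshot"))
        .build();

    bigquery.query(config);
  }
}
```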

If you're looking to have the data copied over and stored in BigQuery, querying Cloud Bigtable data using permanent external tables is not what you're looking for: the documentation explicitly states that "The data is not stored in the BigQuery table". My understanding is that the permanent external table is mostly useful for persistent access controls, but it still queries Bigtable directly.

This may be overkill, but you could set up an Apache Beam pipeline that runs on Dataflow, has a BigtableIO source, and a BigQueryIO sink. You'd have to write a little bit of transformation logic, but overall it should be a pretty simple pipeline. The only catch is that the BigtableIO connector is only available in the Beam Java SDK, so you'd have to write this pipeline in Java.
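A rough sketch of what that pipeline could look like, assuming a Bigtable table whose rows carry a single column family with one column; the project, instance, table, and field names are all hypothetical and the per-row transformation logic would need to match your actual layout.

```java
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.bigtable.v2.Row;
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigtable.BigtableIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.SimpleFunction;

public class BigtableToBigQuery {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Schema for the destination BigQuery table (hypothetical fields).
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("rowkey").setType("STRING"),
        new TableFieldSchema().setName("value").setType("STRING")));

    p.apply("ReadFromBigtable", BigtableIO.read()
            .withProjectId("my-project")    // hypothetical project id
            .withInstanceId("my-instance")  // hypothetical instance id
            .withTableId("my-table"))       // hypothetical table id
        // Flatten each Bigtable row into a BigQuery TableRow.
        .apply("ToTableRow", MapElements.via(new SimpleFunction<Row, TableRow>() {
          @Override
          public TableRow apply(Row row) {
            return new TableRow()
                .set("rowkey", row.getKey().toStringUtf8())
                // Takes the first cell of the first column of the first
                // family; adapt this to your table's actual layout.
                .set("value", row.getFamilies(0).getColumns(0)
                    .getCells(0).getValue().toStringUtf8());
          }
        }))
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // hypothetical destination
            .withSchema(schema)
            .withCreateDisposition(
                BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(
                BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));

    p.run().waitUntilFinish();
  }
}
```

Run on Dataflow by passing the usual runner flags, e.g. `--runner=DataflowRunner --project=my-project --region=us-central1 --tempLocation=gs://my-bucket/tmp` (bucket name hypothetical).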
