
Google Cloud Dataflow ETL (Datastore -> Transform -> BigQuery)

We have an application running on Google App Engine that uses Datastore as its persistence back-end. Currently the application has mostly 'OLTP' features and some rudimentary reporting. While implementing reports we found that processing large amounts of data (millions of objects) is very difficult using Datastore and GQL. To enhance our application with proper reporting and Business Intelligence features, we think it is better to set up an ETL process to move data from Datastore to BigQuery.

Initially we thought of implementing the ETL process as an App Engine cron job, but it looks like Dataflow can also be used for this. We have the following requirements for setting up the process:

  • Be able to push all existing data to BigQuery using the non-streaming API of BigQuery.
  • Once the above is done, push any new data to BigQuery using the streaming API whenever it is updated or created in Datastore.

My questions are:

  1. Is Cloud Dataflow the right candidate for implementing this pipeline?
  2. Will we be able to push the existing data? Some of the kinds have millions of objects.
  3. What would be the right approach to implement it? We are considering two approaches. The first approach is to go through Pub/Sub: for the existing data, create a cron job that pushes all of it to Pub/Sub, and for any new updates, push the data to Pub/Sub at the same time it is written to Datastore; a Dataflow pipeline will pick it up from Pub/Sub and push it to BigQuery. The second approach is to create a batch pipeline in Dataflow that queries Datastore and pushes any new data to BigQuery.

The question is: are these two approaches doable? Which one is better cost-wise? Is there any other way that is better than the above two?

Thank you,

rizTaak

Dataflow can absolutely be used for this purpose. In fact, Dataflow's scalability should make the process fast and relatively easy.

Both of your approaches should work -- I'd give preference to the second one: a batch pipeline to move the existing data, then a streaming pipeline to handle new data via Cloud Pub/Sub. In addition to moving the data, Dataflow allows arbitrary analytics/manipulation to be performed on the data itself.
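Whichever pipeline you pick, the core of it is a per-entity transform that maps a Datastore entity onto a row matching the BigQuery table's schema. A minimal sketch of such a function, with illustrative field names (`id`, `name`, `created_at`) that you would replace with your own kind's properties:

```python
from datetime import datetime, timezone

def entity_to_row(entity):
    """Convert a Datastore entity (here a plain dict of properties)
    into a BigQuery row dict matching the destination table's schema.
    The field names are placeholders for your own kind's properties."""
    created = entity.get("created_at", datetime.now(timezone.utc))
    return {
        "id": entity.get("id"),
        "name": entity.get("name"),
        # BigQuery TIMESTAMP columns accept ISO-8601 strings.
        "created_at": created.isoformat(),
    }

# In a Dataflow (Apache Beam) pipeline this function would be applied
# with a Map transform, roughly (assuming the Beam Python SDK):
#   rows = entities | beam.Map(entity_to_row)
#   rows | beam.io.WriteToBigQuery("my-project:analytics.my_kind", ...)
```

The same function can be reused by both the batch pipeline (reading from a Datastore query) and the streaming pipeline (reading Pub/Sub messages), which keeps the two code paths consistent.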

That said, BigQuery and Datastore can be connected directly. See, for example, Loading Data From Cloud Datastore in the BigQuery documentation.
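The direct route works by taking a Datastore export to Cloud Storage and then running a BigQuery load job with the dedicated `DATASTORE_BACKUP` source format. A sketch of the load-job configuration as it would appear in the `jobs.insert` REST body; the project, bucket, and table names are placeholders, not from the question:

```python
# Sketch of a BigQuery load-job configuration for ingesting a Cloud
# Datastore export directly, without writing any pipeline code.
load_job_config = {
    "configuration": {
        "load": {
            # Datastore export files use this dedicated source format.
            "sourceFormat": "DATASTORE_BACKUP",
            # URI of the export metadata file in Cloud Storage (placeholder).
            "sourceUris": [
                "gs://my-export-bucket/2020-01-01/my_kind.export_metadata"
            ],
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "analytics",
                "tableId": "my_kind",
            },
            # Replace the table contents on re-runs of the bulk load.
            "writeDisposition": "WRITE_TRUNCATE",
        }
    }
}
```

This covers the one-off bulk load well, but unlike the Dataflow approach it gives you no place to transform the data in flight and no streaming path for new updates.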

Another way is to use a third-party solution for loading data into Google BigQuery. There are plenty of them; most are paid, but there are free ones with limited data-loading frequency. In this case you won't need to code anything.
