
Loading data from BigQuery into Redis

I'm trying to load data from BigQuery into Redis, and after going through the documentation for the last 3 days, I am turning to SO because I found nothing concrete. What would be a good way to load the results of multiple queries, each returning a few hundred records, from BigQuery into Redis? Are there any code samples floating around that at least show how to translate a result set into key-value pairs suitable for loading into Redis? I'd like to use Python to implement this pipeline. Please help. Thanks!

Edit: We have a few datasets that we'd like to push to Redis, so that when a user runs a query, it runs against Redis and not BigQuery. I have a snippet that pulls the results of a query into a DataFrame in Python. I have been unable to find code snippets for loading this data into Redis, or even for translating it into key-value pairs that Redis can consume. This is the part I need help with.
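For context, a minimal sketch of the kind of snippet described, assuming the google-cloud-bigquery and redis-py clients; the table, column names, and key scheme are all hypothetical:

    from google.cloud import bigquery
    import redis

    bq_client = bigquery.Client()
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    # Hypothetical query; the real table and columns will differ.
    QUERY = "SELECT user_id, name, score FROM `my_project.my_dataset.my_table`"

    # Pull the result set into a pandas DataFrame.
    df = bq_client.query(QUERY).to_dataframe()

    # One common translation: one Redis hash per row, keyed on the row's ID.
    pipe = r.pipeline()
    for row in df.itertuples(index=False):
        pipe.hset(
            f"user:{row.user_id}",
            # Cast values to str so the Redis client accepts any column type.
            mapping={"name": str(row.name), "score": str(row.score)},
        )
    pipe.execute()

Batching the writes through a pipeline keeps the load to a single round trip per batch rather than one per row.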

  1. Export the BigQuery table to a GCS bucket as newline-delimited JSON (see the export sketch after this list)
  2. Use a Google Cloud Function triggered by GCS object changes to read the JSON file when it changes
  3. Attach a Serverless VPC connector to the Google Cloud Function so it can run a Redis client. It can then write the contents of the file to Redis (sketched after this list).
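A minimal sketch of step 1, assuming the google-cloud-bigquery Python client; the table and bucket names are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Export the table as newline-delimited JSON to GCS.
    job = client.extract_table(
        "my_project.my_dataset.my_table",            # hypothetical table
        "gs://my-export-bucket/exports/my_table.json",  # hypothetical bucket
        job_config=bigquery.ExtractJobConfig(
            destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
        ),
    )
    job.result()  # block until the extract job finishes

And a sketch of steps 2 and 3: a background Cloud Function on a GCS object-change trigger that reads the file and writes one Redis hash per line. The Redis host (reached through the VPC connector) and the redis_key field are assumptions; the key-in-the-row idea is explained after this answer:

    import json

    import redis
    from google.cloud import storage

    # Memorystore/Redis is reachable via the Serverless VPC connector;
    # this host IP is a placeholder.
    r = redis.Redis(host="10.0.0.3", port=6379)
    storage_client = storage.Client()

    def gcs_to_redis(event, context):
        """Entry point for a background Cloud Function on GCS object changes."""
        blob = storage_client.bucket(event["bucket"]).blob(event["name"])
        pipe = r.pipeline()
        for line in blob.download_as_text().splitlines():
            record = json.loads(line)
            # Assumes each exported row carries its own key in a "redis_key"
            # field and that the remaining fields are flat scalars.
            key = record.pop("redis_key")
            pipe.hset(key, mapping={k: str(v) for k, v in record.items()})
        pipe.execute()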

I ended up putting the Redis key to use in the BigQuery row, but those are details you can figure out yourself. I wrote a full blog post about it here: https://www.futurice.com/blog/bigquery-to-memorystore
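One way to bake the key into the row, sketched under the assumption that it is computed in SQL before the export in step 1; all identifiers here are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Materialize a copy of the table where each row carries its own Redis key,
    # then export that table as newline-delimited JSON as in step 1.
    query = """
        SELECT
          CONCAT('user:', CAST(user_id AS STRING)) AS redis_key,
          name,
          score
        FROM `my_project.my_dataset.my_table`
    """
    job_config = bigquery.QueryJobConfig(
        destination="my_project.my_dataset.my_table_keyed",
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    client.query(query, job_config=job_config).result()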
