Firebase Analytics data to Redshift - BigQuery RECORD data type to Redshift
I need to export Firebase Analytics data that currently lives in BigQuery to Redshift.
The issue I have is with the BigQuery fields that are of the RECORD data type (screenshot of the BQ schema below):
How would I store these key/values in Redshift? I was initially thinking of creating an event_params dimension table linked back to the main events table by a UUID.
But as there are multiple key/value pairs per event, each event in the events table would have multiple rows in the dimension table - which, in my mind, will still be messy when running reports on the data.
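A minimal sketch of that design, using SQLite in place of Redshift and hypothetical table/column names, shows the fan-out: joining the dimension back to the fact produces one row per parameter for each event.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical names: one row per key/value in event_params,
# linked back to events by a UUID, as described above.
cur.executescript("""
CREATE TABLE events (
    event_id   TEXT PRIMARY KEY,   -- the linking UUID
    event_name TEXT
);
CREATE TABLE event_params (
    event_id    TEXT REFERENCES events(event_id),
    param_key   TEXT,
    param_value TEXT
);
""")

cur.execute("INSERT INTO events VALUES ('uuid-1', 'screen_view')")
cur.executemany("INSERT INTO event_params VALUES ('uuid-1', ?, ?)",
                [("firebase_screen", "Home"),
                 ("engagement_time_msec", "1200")])

# Joining fans each event out to one row per parameter.
rows = cur.execute("""
    SELECT e.event_id, p.param_key, p.param_value
    FROM events e JOIN event_params p USING (event_id)
""").fetchall()
print(len(rows))  # 2 rows for a single event
```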
How should I model this data in Redshift?
Your data model design should be driven by your reporting requirements: how you need to report on your event parameters will inform how your model needs to be designed.
Given that caveat, if you just want to add these event parameters to an existing event star schema (and I assume events is a fact table with a grain of one record per event), then you would need to implement a bridge table between the event fact table and the event parameter dimension table.