
How to make the Kafka Connect BigQuery Sink Connector create one table per event type and not per topic?

I'm using confluentinc/kafka-connect-bigquery on our Kafka (Avro) events. On some topics we have more than one event type; for example, UserRegistered and UserDeleted are both on the topic domain.user.

The subjects in our Schema Registry look as follows:

curl --silent -X GET http://avro-schema-registry.core-kafka.svc.cluster.local:8081/subjects | jq .
[...]
  "domain.user-com.acme.message_schema.domain.user.UserDeleted",
  "domain.user-com.acme.message_schema.domain.user.UserRegistered",
  "domain.user-com.acme.message_schema.type.domain.key.DefaultKey",
[...]

My properties/connector.properties (I'm using the quickstart folder) looks as follows:

[...]
topics.regex=domain.*
sanitizeTopics=true
autoCreateTables=true
[...]

In BigQuery a table called domain_user is created. However, I would like to have two tables, e.g., domain_user_userregistered and domain_user_userdeleted or similar, because the schemas of these two event types are quite different. How can I achieve this?

I think you can use the SchemaNameToTopic Single Message Transform to do this. By rewriting each record's topic to its schema name, that name will propagate through to the name given to the auto-created BigQuery table.
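As a rough sketch, the connector configuration could look as follows. The fully qualified class name below assumes the SchemaNameToTopic transform from Jeremy Custenborder's kafka-connect-transform-common package is installed on the Connect worker's plugin path; verify the exact class name against the jar you deploy:

```properties
# Hypothetical SMT configuration (class name is an assumption, check your installed
# transform package): rewrite each record's topic to its Avro schema name, e.g.
# com.acme.message_schema.domain.user.UserRegistered, so that autoCreateTables
# creates one BigQuery table per event type instead of one per topic.
transforms=schemaNameToTopic
transforms.schemaNameToTopic.type=com.github.jcustenborder.kafka.connect.transform.common.SchemaNameToTopic

# sanitizeTopics then turns the dotted schema name into a legal BigQuery table name.
sanitizeTopics=true
autoCreateTables=true
```

Note that with this approach the table names would be derived from the full schema name (something like com_acme_message_schema_domain_user_UserRegistered after sanitization), not from a topic-prefixed form such as domain_user_userregistered.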

