
Is it possible to define a schema for Google Pub/Sub topics like in Kafka with AVRO?

As far as I know, we can define AVRO schemas on Kafka, and a topic defined with such a schema will only accept data that matches it. This is really useful for validating the data structure before it is accepted into the queue.
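For reference, an Avro schema of the kind used with Kafka is just a JSON document; a minimal, purely illustrative example:

```json
{
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "email", "type": "string"}
  ]
}
```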

Is there anything similar in Google Pub/Sub?

Kafka itself does not validate schemas, and topics therefore have no inherent schema: they hold nothing more than pairs of byte arrays plus some metadata. It is the serializer in the producing client that performs the validation before the data reaches the topic. Similarly, Pub/Sub at the end of the day only stores/sends byte[] data.

Therefore, in theory, it's completely feasible to use something similar to the Confluent Avro Schema Registry on either end of the data that moves through Pub/Sub. Google does not offer such a feature, AFAIK, so you would need to recreate that service to perform your Avro compatibility checks, and wrap a Pub/Sub serializer + producer client around that service's client. For instance, you could run the Registry itself as a container in GKE to begin with.
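The producer-side wrapper described above can be sketched as follows. This is a hypothetical illustration, not an existing API: `ValidatingPublisher`, `USER_SCHEMA`, and `publish_fn` are all made-up names, and `publish_fn` merely stands in for a real client's publish call (such as a Pub/Sub publisher's). A real registry-based setup would also Avro-encode the payload and attach a schema ID; plain JSON keeps the sketch dependency-free.

```python
import json
from typing import Callable

# Flat Avro-style record schema, for illustration only.
USER_SCHEMA = {
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": "string"},
    ],
}


class ValidatingPublisher:
    """Hypothetical sketch of a validating producer wrapper.

    `publish_fn` stands in for a real client's publish call; this class
    itself is not a Google or Confluent API.
    """

    _TYPES = {"string": str, "int": int, "long": int,
              "boolean": bool, "double": float}

    def __init__(self, schema: dict, publish_fn: Callable[[bytes], object]):
        self.schema = schema
        self.publish_fn = publish_fn

    def publish(self, record: dict) -> None:
        # Reject the record before any bytes reach the topic, mirroring
        # what a Kafka Avro serializer does on the producing side.
        for field in self.schema["fields"]:
            value = record.get(field["name"])
            if not isinstance(value, self._TYPES[field["type"]]):
                raise ValueError(f"schema violation on field {field['name']!r}")
        # A registry-based setup would Avro-encode and prepend a schema ID
        # here; JSON keeps this sketch free of external dependencies.
        self.publish_fn(json.dumps(record).encode("utf-8"))
```

Because the publish function is injected, the validation logic can be exercised without any Pub/Sub credentials, e.g. by passing `sent.append` for a list `sent`.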

You might want to check out Avro message for Google Cloud Pub-Sub.

Yes, you can use schemas to validate Google Cloud Pub/Sub topics and messages.

See the docs here. AVRO and Protobuf are supported.
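With native schema support, the flow with the gcloud CLI looks roughly like this (resource names and the `user.avsc` file are illustrative; flags per the gcloud pubsub documentation):

```shell
# Create an Avro schema resource from a local definition file.
gcloud pubsub schemas create user-schema \
    --type=avro \
    --definition-file=user.avsc

# Create a topic bound to that schema; messages that do not match
# the schema are rejected at publish time.
gcloud pubsub topics create user-events \
    --schema=user-schema \
    --message-encoding=json
```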
