I am using the Avro Producer in Python 2.7. I need to send a message with a key and a value; the value has an Avro schema registered for the topic, but there is no Avro schema for the key (and I can't add one, for legacy reasons).
This is my code:
import os

from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer, CachedSchemaRegistryClient

def main():
    kafkaBrokers = os.environ.get('KAFKA_BROKERS')
    schemaRegistry = os.environ.get('SCHEMA_REGISTRY')
    topic = os.environ.get('KAFKA_TOPIC')
    subject = '{}-value'.format(topic)
    sr = CachedSchemaRegistryClient(schemaRegistry)
    schema = sr.get_latest_schema(subject).schema
    value_schema = avro.loads(str(schema))
    value = {'url': u'test.com', 'priority': 10}
    avroProducer = AvroProducer({
        'bootstrap.servers': kafkaBrokers,
        'schema.registry.url': schemaRegistry
    }, default_value_schema=value_schema)
    key = 1638895406382020875
    avroProducer.produce(topic=topic, value=value, key=key)
    avroProducer.flush()
I get the following error:
raise KeySerializerError("Avro schema required for key")
confluent_kafka.avro.serializer.KeySerializerError: Avro schema required for key
If I delete the key from the produce function:
avroProducer.produce(topic=topic, value=value)
It works.
How is it possible to send the key without a schema?
You'll need to use a regular Producer and perform the serialization yourself:
from confluent_kafka import Producer, avro
from confluent_kafka.avro import CachedSchemaRegistryClient
from confluent_kafka.avro.serializer.message_serializer import MessageSerializer as AvroSerializer

def delivery_report(err, msg):
    if err is not None:
        print('Delivery failed: {}'.format(err))

sr = CachedSchemaRegistryClient(schema_registry)
avro_serializer = AvroSerializer(sr)
serialize_avro = avro_serializer.encode_record_with_schema  # extract the bound method

value_schema = avro.load('avro_schemas/value.avsc')  # TODO: Create avro_schemas folder

p = Producer({'bootstrap.servers': bootstrap_servers})
value_payload = serialize_avro(topic, value_schema, value, is_key=False)
p.produce(topic, key=key, value=value_payload, callback=delivery_report)
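How the key itself should be turned into bytes depends entirely on what your legacy consumers expect; since there is no schema, any byte representation is legal from Kafka's point of view. As a sketch (the two options below are illustrative assumptions, not something mandated by your setup), here are the two most common choices for a numeric key like yours, using only the standard library:

```python
import struct

key = 1638895406382020875

# Option 1: UTF-8 decimal string, matching Kafka's StringSerializer
key_as_string = str(key).encode('utf-8')

# Option 2: 8-byte big-endian signed long, matching Kafka's LongSerializer
key_as_long = struct.pack('>q', key)
```

Whichever you pick, pass the resulting bytes as `key=` to `Producer.produce`; just make sure it matches what the consumers on the legacy side deserialize with.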
AvroProducer assumes that both keys and values are encoded with the schema registry, prepending a magic byte and the schema id to the payload of both the key and the value.
If you want to use a custom serialization for the key, you can use a Producer instead of an AvroProducer. But it then becomes your responsibility to serialize the key (in whatever format you want) and the value (which means Avro-encoding it and prepending the magic byte and the schema id). To see how this is done, look at the AvroProducer code.
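The framing mentioned above is small: a zero magic byte, the 4-byte big-endian schema id, then the Avro-encoded body. A minimal sketch of building and parsing such a payload with the standard library (the schema id 42 and the body bytes are made-up values for illustration; `encode_record_with_schema` does this framing for you):

```python
import struct

MAGIC_BYTE = 0

def frame(schema_id, avro_body):
    # magic byte (0) + 4-byte big-endian schema id + Avro-encoded body
    return struct.pack('>bI', MAGIC_BYTE, schema_id) + avro_body

def unframe(payload):
    magic, schema_id = struct.unpack('>bI', payload[:5])
    if magic != MAGIC_BYTE:
        raise ValueError('message does not start with the magic byte')
    return schema_id, payload[5:]

payload = frame(42, b'avro-bytes')
schema_id, body = unframe(payload)
```

A consumer would use the recovered schema id to fetch the writer schema from the registry before decoding the body.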
But it also means you'll have to write your own AvroConsumer, and you won't be able to use the kafka-avro-console-consumer.