I'm working on the producer side of Kafka, pushing messages to a topic with the confluent-kafka Avro producer. Below are my .avsc schema files.
Keys.avsc
{
    "namespace": "io.codebrews.schema.test",
    "type": "record",
    "name": "Keys",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "email", "type": "string"}
    ]
}
Test.avsc
{
    "namespace": "io.codebrews.schema.test",
    "type": "record",
    "name": "Subscription",
    "fields": [
        {"name": "test", "type": "string"},
        {"name": "keys", "type": "io.codebrews.schema.test.Keys"}
    ]
}
Producer.py
from confluent_kafka import avro

def load_avro_schema_from_file(schema_file):
    # Message keys are plain strings; the value schema comes from the .avsc file.
    key_schema = avro.loads('{"type": "string"}')
    value_schema = avro.load("./avro/" + schema_file)
    return key_schema, value_schema

key_schema, value_schema = load_avro_schema_from_file('Subscription.avsc')

try:
    producer = avro.AvroProducer(producer_config,
                                 default_key_schema=key_schema,
                                 default_value_schema=value_schema)
except Exception as e:
    raise e
Registering Keys.avsc works fine with no error, but when I try to register Test.avsc after registering Keys.avsc, I get the error below.
confluent_kafka.avro.error.ClientError: Schema parse failed: Unknown named schema 'io.codebrews.schema.test.Keys', known names: ['io.codebrews.schema.test.Subscription'].
I then registered the schema below manually:
{
    "namespace": "io.codebrews.schema.test",
    "type": "record",
    "name": "Subscription",
    "fields": [
        {"name": "test", "type": "string"},
        {"name": "keys", "type": "Keys"}
    ]
}
Now when I push a message to my topic, I get this error:
ClientError: Incompatible Avro schema:409 message:{'error_code': 409, 'message': 'Schema being registered is incompatible with an earlier schema for subject "test-value".
Am I doing something wrong here?
Also, can anyone tell me how to stop automatic schema registration in Python?
The error comes from the Avro parser, not from the registry registration. Your .avsc files need to be fully self-contained, including every record type they reference: when the parser reads one file, it has no way of knowing about types defined in the others.
If you start from an AVDL file and convert that into AVSC, nested records do get properly embedded in the outer records.
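As a sketch, an AVDL file covering both records could look like this (the protocol name here is arbitrary):

```
@namespace("io.codebrews.schema.test")
protocol Subscriptions {
  record Keys {
    string name;
    string email;
  }

  record Subscription {
    string test;
    Keys keys;
  }
}
```

A converter such as the `idl2schemata` subcommand of avro-tools can then expand it into one self-contained .avsc file per record, with Keys inlined inside Subscription.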
Specifically, the keys field needs the full Keys definition inlined:
"fields": [
    {
        "name": "keys",
        "type": {
            "type": "record",
            "name": "Keys",
            "namespace": "io.codebrews.schema.test",
            "fields": [
                ...
            ]
        }
    }
]
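That embedding can also be done programmatically before handing the schema to the parser. A pure-stdlib sketch (the two dicts mirror the .avsc files from the question; `embed_schema` is a hypothetical helper, not part of confluent-kafka):

```python
import json

def embed_schema(outer: dict, inner: dict) -> dict:
    """Replace any field whose type is a named reference to `inner`
    with the full inner record definition."""
    full_name = f"{inner['namespace']}.{inner['name']}"
    for field in outer["fields"]:
        if field["type"] in (full_name, inner["name"]):
            field["type"] = inner  # inline the record definition
    return outer

keys = {
    "namespace": "io.codebrews.schema.test",
    "type": "record",
    "name": "Keys",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "email", "type": "string"},
    ],
}
subscription = {
    "namespace": "io.codebrews.schema.test",
    "type": "record",
    "name": "Subscription",
    "fields": [
        {"name": "test", "type": "string"},
        {"name": "keys", "type": "io.codebrews.schema.test.Keys"},
    ],
}

combined = embed_schema(subscription, keys)
# combined is now self-contained, so every name resolves when it is
# parsed, e.g. with avro.loads(json.dumps(combined)).
```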
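On the second question: as far as I know, the legacy AvroProducer does not expose a switch for this, but the newer serializer API does via its configuration dict. A configuration sketch, assuming a registry at localhost:8081 and a `schema_str` variable holding the self-contained Subscription schema as a JSON string:

```python
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer

schema_registry_client = SchemaRegistryClient({"url": "http://localhost:8081"})

value_serializer = AvroSerializer(
    schema_registry_client,
    schema_str,  # self-contained Subscription schema (JSON string)
    conf={"auto.register.schemas": False},  # look up the schema, never register it
)
```

With auto-registration off, serialization fails if the schema has not already been registered for the subject, instead of silently registering it.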