Flink Avro Serialization shows "not serializable" error when working with GenericRecords
I am really having a hard time getting Flink to communicate properly with a running Kafka instance that uses Avro schemas from the Confluent Schema Registry (for both key and value).
After some time of thinking and restructuring my program, I was able to push my implementation this far:
Producer method
public static FlinkKafkaProducer<Tuple2<GenericRecord, GenericRecord>> kafkaAvroGenericProducer() {
    final Properties properties = new Properties();
    properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "--.-.-.--:9092");
    properties.put("schema.registry.url", "http://--.-.-.---:8081");
    properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KVSerializationSchema.class.getName()); // wrong class, but should not matter
    properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KVSerializationSchema.class.getName()); // wrong class, but should not matter

    return new FlinkKafkaProducer<Tuple2<GenericRecord, GenericRecord>>("flink_output",
            new GenericSerializer("flink_output", schemaK, schemaV, "http://--.-.-.---:8081"),
            properties, FlinkKafkaProducer.Semantic.EXACTLY_ONCE);
}
GenericSerializer.java
package com.reeeliance.flink;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;
import flinkfix.ConfluentRegistryAvroSerializationSchema;

public class GenericSerializer implements KafkaSerializationSchema<Tuple2<GenericRecord, GenericRecord>> {

    private String topic;
    private Schema schemaKey;
    private Schema schemaValue;
    private String registryUrl;

    public GenericSerializer(String topic, Schema schemaK, Schema schemaV, String url) {
        super();
        this.topic = topic;
        this.schemaKey = schemaK;
        this.schemaValue = schemaV;
        this.registryUrl = url;
    }

    public GenericSerializer() {
        super();
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(Tuple2<GenericRecord, GenericRecord> element, Long timestamp) {
        byte[] key = ConfluentRegistryAvroSerializationSchema.forGeneric(topic + "-key", schemaKey, registryUrl).serialize(element.f0);
        byte[] value = ConfluentRegistryAvroSerializationSchema.forGeneric(topic + "-value", schemaValue, registryUrl).serialize(element.f1);
        return new ProducerRecord<byte[], byte[]>(topic, key, value);
    }
}
However, when I execute the job, it fails in the preparation phase, before the job actually runs, with the following error:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: [H_EQUNR type:STRING pos:0] is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:151)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:126)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:126)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:71)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:617)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:571)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:547)
at com.reeeliance.flink.StreamingJob.kafkaAvroGenericProducer(StreamingJob.java:257)
at com.reeeliance.flink.StreamingJob.main(StreamingJob.java:84)
Caused by: java.io.NotSerializableException: org.apache.avro.Schema$Field
- custom writeObject data (class "java.util.ArrayList")
- root object (class "org.apache.avro.Schema$LockableArrayList", [H_EQUNR type:STRING pos:0])
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1182)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.ArrayList.writeObject(ArrayList.java:766)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:586)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:133)
... 8 more
I know that all classes have to implement the Serializable interface or be made transient, but I am not using any classes of my own, and the error does not point at a function that is not serializable (as it usually does with threads), but at a record or field. The field comes from the key schema, a schema containing only this single field. I assume my mistake lies in using GenericRecord, which does not implement the Serializable interface, but I see GenericRecord used for exactly this kind of serialization all the time, so it does not make sense to me.
The class ConfluentRegistryAvroSerializationSchema is taken from GitHub, as it is not yet included in the Flink version we use (1.9.1). I included the necessary classes and adapted the class, and I don't think this is the cause of my problem. (Issue resolved)
Can somebody help me debug this? I would also greatly appreciate it if you could show me a different way to achieve the same goal; the incompatibility of Flink Avro and the Confluent Schema Registry has been driving me crazy so far.
The exception message tells you which class is not serializable:
Caused by: java.io.NotSerializableException: org.apache.avro.Schema$Field
The problem lies in the Schema class, which you store in the fields of your GenericSerializer.
You can try this instead:
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;
import flinkfix.ConfluentRegistryAvroSerializationSchema;

public class GenericSerializer implements KafkaSerializationSchema<Tuple2<GenericRecord, GenericRecord>> {

    private final String topic;
    private final SerializationSchema<GenericRecord> keySerializer;
    private final SerializationSchema<GenericRecord> valueSerializer;

    public GenericSerializer(String topic, Schema schemaK, Schema schemaV, String url) {
        this.topic = topic;
        // Build the Confluent serializers once in the constructor; only the
        // (serializable) SerializationSchema instances end up in fields
        this.keySerializer = ConfluentRegistryAvroSerializationSchema.forGeneric(topic + "-key", schemaK, url);
        this.valueSerializer = ConfluentRegistryAvroSerializationSchema.forGeneric(topic + "-value", schemaV, url);
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(Tuple2<GenericRecord, GenericRecord> element, Long timestamp) {
        byte[] key = keySerializer.serialize(element.f0);
        byte[] value = valueSerializer.serialize(element.f1);
        return new ProducerRecord<byte[], byte[]>(topic, key, value);
    }
}
ConfluentRegistryAvroSerializationSchema is serializable, so you can safely store it in a field of GenericSerializer. It will also perform better, since the underlying structures are no longer re-instantiated for every incoming record.
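If you cannot (or do not want to) rely on ConfluentRegistryAvroSerializationSchema, a common workaround for the non-serializable Schema is to keep only its JSON string in a field and re-parse it lazily on the task side. A minimal sketch under that assumption; the class name is illustrative and not part of the original code:

import org.apache.avro.Schema;

public class SerializableSchemaHolder implements java.io.Serializable {

    private final String schemaJson;   // String is serializable, Schema is not
    private transient Schema schema;   // rebuilt lazily after deserialization

    public SerializableSchemaHolder(Schema schema) {
        this.schemaJson = schema.toString();
    }

    public Schema getSchema() {
        if (schema == null) {
            schema = new Schema.Parser().parse(schemaJson);
        }
        return schema;
    }
}

A KafkaSerializationSchema can then hold such a holder (or the raw JSON string) in a field and build its Avro writers on first use, which keeps Flink's closure cleaner happy.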
Any conclusions on the issue of Flink falling back to Kryo for Avro generic records?
I am using Scala and added the type information like this:
implicit val typeInformation: TypeInformation[GenericRecord] = TypeInformation.of(new TypeHint[GenericRecord] {
  new GenericRecordAvroTypeInfo(EventMessage.SCHEMA$)
})
The stream is set up as follows:
DataStream[GenericRecord]
But the Flink runtime still falls back to Kryo, because it cannot recognize the Avro GenericRecord and treats it as an arbitrary generic type.
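A hedged observation on the snippet above: the GenericRecordAvroTypeInfo constructed inside the TypeHint body is discarded, so Flink only ever sees a plain TypeHint[GenericRecord] and falls back to Kryo. The type information has to be handed to Flink directly. A minimal Java sketch (the source parameter is a placeholder, not from the original code):

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.typeutils.GenericRecordAvroTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;

public class AvroTypeInfoExample {

    // Attaches Avro type information to a GenericRecord stream so that Flink
    // serializes records with Avro instead of the generic Kryo fallback
    public static DataStream<GenericRecord> withAvroTypeInfo(
            DataStreamSource<GenericRecord> source, Schema schema) {
        return source.returns(new GenericRecordAvroTypeInfo(schema));
    }
}

In the Scala snippet, the equivalent would be assigning new GenericRecordAvroTypeInfo(EventMessage.SCHEMA$) directly to the implicit TypeInformation[GenericRecord] value instead of wrapping it in a TypeHint.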
I am getting the same error (Caused by: java.io.NotSerializableException: org.apache.avro.Schema$Field) when deploying my Flink job. Here are my serializer and deserializer.
public static class AVROGeneratorSchema implements SerializationSchema<GenericRecord> {
    @Override
    public byte[] serialize(GenericRecord genericData) {
        StringBuffer sb = new StringBuffer();
        sb.append((String) genericData.get("field1"));
        sb.append("::");
        sb.append((String) genericData.get("field2"));
        sb.append("::");
        sb.append((String) genericData.get("field3"));
        return sb.toString().getBytes();
    }
}
And the deserializer:
public static class AVROGeneratorDeSchema implements DeserializationSchema<GenericRecord> {
    @Override
    public GenericRecord deserialize(byte[] bytes) throws IOException {
        GenericRecord responseData = new GenericData.Record(new Schema.Parser().parse(schemaStr));
        String[] tokens = new String(bytes).split("::");
        responseData.put("field1", tokens[0]);
        responseData.put("field2", tokens[1]);
        responseData.put("field3", tokens[2]);
        return responseData;
    }

    @Override
    public TypeInformation<GenericRecord> getProducedType() {
        return TypeExtractor.getForClass(GenericRecord.class);
    }

    @Override
    public boolean isEndOfStream(GenericRecord genericData) {
        return false;
    }
}
Here is the error stack:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: [field1 type:STRING pos:0] is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:151)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:126)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:126)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:71)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:617)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:571)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.<init>(FlinkKafkaProducer.java:547)
at com.reeeliance.flink.StreamingJob.kafkaAvroGenericProducer(StreamingJob.java:257)
at com.reeeliance.flink.StreamingJob.main(StreamingJob.java:84)
Caused by: java.io.NotSerializableException: org.apache.avro.Schema$Field
- custom writeObject data (class "java.util.ArrayList")
- root object (class "org.apache.avro.Schema$LockableArrayList", [field1 type:STRING pos:0])
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1182)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at java.util.ArrayList.writeObject(ArrayList.java:766)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:586)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:133)
I don't want to use the Confluent Schema Registry. Any clue?
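A hedged pointer rather than a verified fix: the trace again shows the closure cleaner choking on an org.apache.avro.Schema captured somewhere in the job (the snippets above only keep the schema as the string schemaStr, so the offending Schema likely lives in another field or lambda); the same remedy applies, i.e. store only the schema JSON and parse it lazily, or mark Schema fields transient. If the goal is Avro without the Confluent registry at all, flink-avro ships a registry-free AvroDeserializationSchema. A minimal sketch; the class and parameter names are illustrative:

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.AvroDeserializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class RegistryFreeAvroConsumer {

    // Decodes plain Avro bytes into GenericRecord without any schema registry;
    // the schema is passed around as its (serializable) JSON string
    public static FlinkKafkaConsumer<GenericRecord> create(
            String topic, String schemaJson, Properties kafkaProps) {
        Schema schema = new Schema.Parser().parse(schemaJson);
        return new FlinkKafkaConsumer<>(
                topic, AvroDeserializationSchema.forGeneric(schema), kafkaProps);
    }
}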