Kafka Streams: Store is not ready
We recently upgraded Kafka to v1.1 and Confluent to v4.0, but since the upgrade we have encountered a persistent problem with state stores. Our application starts a collection of streams and checks that the state stores are ready, killing the application after 100 failed tries. Since the upgrade, there is at least one stream that keeps reporting:

Store is not ready : the state store, <your stream>, may have migrated to another instance

The stream itself is in the RUNNING state and messages flow through it, but the store still shows up as not ready, so I have no idea what may be happening.
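The "check the store, then give up after 100 tries" loop described above can be sketched roughly like this. Everything here is hypothetical: `waitForStore` is an illustrative helper, and `StoreNotReadyException` is a local stand-in for Kafka's `InvalidStateStoreException` so the sketch runs without a broker; with a real `KafkaStreams` instance the supplier would call `streams.store(...)` instead.

```java
// A minimal sketch, assuming a retry loop around a store lookup.
// StoreNotReadyException stands in for Kafka's InvalidStateStoreException.
public class StoreReadyCheck {

    static class StoreNotReadyException extends RuntimeException {}

    interface StoreSupplier<T> {
        T get();
    }

    // Retry the supplier until it stops throwing, or fail after maxTries attempts.
    static <T> T waitForStore(StoreSupplier<T> supplier, int maxTries, long sleepMs)
            throws InterruptedException {
        for (int attempt = 1; attempt <= maxTries; attempt++) {
            try {
                return supplier.get();
            } catch (StoreNotReadyException e) {
                Thread.sleep(sleepMs); // store may still be restoring or rebalancing
            }
        }
        throw new IllegalStateException("Store not ready after " + maxTries + " tries");
    }

    public static void main(String[] args) throws InterruptedException {
        final int[] calls = {0};
        // Simulate a store that only becomes queryable on the third attempt.
        String store = waitForStore(() -> {
            if (++calls[0] < 3) throw new StoreNotReadyException();
            return "note-store";
        }, 100, 1);
        System.out.println(store + " ready after " + calls[0] + " tries");
    }
}
```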
We are running Kafka in a cluster with 3 brokers. Below is a sample stream (not the entire code):
public BaseStream createStreamInstance() {
    final Serializer<JsonNode> jsonSerializer = new JsonSerializer();
    final Deserializer<JsonNode> jsonDeserializer = new JsonDeserializer();
    final Serde<JsonNode> jsonSerde = Serdes.serdeFrom(jsonSerializer, jsonDeserializer);
    MessagePayLoadParser<Note> noteParser = new MessagePayLoadParser<Note>(Note.class);
    GenericJsonSerde<Note> noteSerde = new GenericJsonSerde<Note>(Note.class);
    StreamsBuilder builder = new StreamsBuilder();

    // The reducer below uses sets to combine values.
    // value1 is what is already present in the store;
    // value2 is the incoming message and, for notes, should have at most one item
    // in its list (since it is 1 attachment / 1 tag per row, but multiple rows per note).
    Reducer<Note> reducer = new Reducer<Note>() {
        @Override
        public Note apply(Note value1, Note value2) {
            value1.merge(value2);
            return value1;
        }
    };

    KTable<Long, Note> noteTable = builder
            .stream(this.subTopic, Consumed.with(jsonSerde, jsonSerde))
            .map(noteParser::parse)
            .groupByKey(Serialized.with(Serdes.Long(), noteSerde))
            .reduce(reducer);

    noteTable.toStream().to(this.pubTopic, Produced.with(Serdes.Long(), noteSerde));
    this.stream = new KafkaStreams(builder.build(), this.properties);
    return this;
}
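For readers unfamiliar with the reducer's merge semantics, here is a plain stand-in for the `Note` class (its fields are hypothetical, since the real `Note` and `MessagePayLoadParser` are not shown). It illustrates how `value1.merge(value2)` folds each single-attachment/single-tag row into the accumulated state held in the store:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Hypothetical stand-in for the question's Note class: each incoming row carries
// at most one attachment and one tag, and merge() accumulates them into sets.
public class Note {
    final long id;
    final Set<String> attachments = new LinkedHashSet<>();
    final Set<String> tags = new LinkedHashSet<>();

    Note(long id) {
        this.id = id;
    }

    // Mirrors the reducer: fold the incoming row (other) into this accumulated state.
    Note merge(Note other) {
        attachments.addAll(other.attachments);
        tags.addAll(other.tags);
        return this;
    }

    public static void main(String[] args) {
        Note state = new Note(1L); // value1: what is already in the store
        Note row1 = new Note(1L);  // value2: an incoming row with one attachment
        row1.attachments.add("a.pdf");
        Note row2 = new Note(1L);  // another row for the same note, with one tag
        row2.tags.add("urgent");
        state.merge(row1).merge(row2);
        System.out.println(state.attachments + " " + state.tags);
    }
}
```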
There are some open questions here, like the ones Matthias raised in the comments, but I will try to answer/give help with your actual questions: