
Kafka data loss in producer

I have been trying to configure one Kafka broker, one topic, one producer, and one consumer. While the producer is publishing, if the broker goes down, data is lost, e.g.:

In Buffer:
Datum 1 - published
Datum 2 - published
.
. ---->(Broker goes down for a while and reconnects...)
.
Datum 4 - published
Datum 5 - published

The properties configured for the producer are:

bootstrap.servers=localhost:9092
acks=all
retries=1
batch.size=16384
linger.ms=2
buffer.memory=33554432
key.serializer=org.apache.kafka.common.serialization.IntegerSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
producer.type=sync
buffer.size=102400
reconnect.interval=30000
request.required.acks=1

The data size is less than the configured buffer size. Help me understand where I am going wrong!

I am not sure exactly what you are doing. I would assume that the messages you try to write to Kafka while the broker is down are not acked by Kafka. If a message is not acked, it was not written to Kafka, and the producer needs to retry writing it.

The easiest way to do this is to set the configuration parameters retries and retry.backoff.ms accordingly, as in the sketch below.
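A minimal sketch of such a configuration, reusing the bootstrap server and serializers from your question; the retry count and backoff value here are illustrative, not tuned recommendations:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;

public class ReliableProducerConfig {
    public static KafkaProducer<Integer, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Wait for acknowledgment from all in-sync replicas before a write counts as successful.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        // Keep retrying transient failures (e.g. broker briefly down) instead of giving up quickly.
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);
        // Wait 1 second between retry attempts (illustrative value).
        props.put(ProducerConfig.RETRY_BACKOFF_MS_CONFIG, 1000);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.IntegerSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        return new KafkaProducer<>(props);
    }
}

Note that with the internal retry mechanism, a single retries=1 (as in your config) allows only one extra attempt, which is easily exhausted while the broker is down.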

At the application level, you can also register a Callback via send(..., Callback) to be informed about success or failure. In case of failure, you could retry by calling send() again, as sketched below.
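A rough sketch of that pattern; sendWithRetry is a hypothetical helper name, and a real implementation should bound the number of retry attempts rather than retrying forever:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class CallbackRetry {
    // Hypothetical helper: re-submit the record whenever the send fails,
    // e.g. because the broker was down. Real code should cap the retries.
    static <K, V> void sendWithRetry(KafkaProducer<K, V> producer,
                                     ProducerRecord<K, V> record) {
        producer.send(record, (metadata, exception) -> {
            if (exception != null) {
                sendWithRetry(producer, record); // failure: try sending again
            }
            // On success, metadata holds the partition and offset of the write.
        });
    }
}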
