
Suppress stack trace logging in Spring Kafka container errors

I'm using a simple Spring Boot app to consume from a Kafka topic, and I configured my container factory with a SeekToCurrentErrorHandler:

@Bean
public ConcurrentKafkaListenerContainerFactory<Object, Object> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ConsumerFactory<Object, Object> kafkaConsumerFactory) {
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
    ...
    final SeekToCurrentErrorHandler seekToCurrentErrorHandler = new SeekToCurrentErrorHandler(
            (consumerRecord, e) -> log.error("Error reading from topic: {}, Offset: {}, Partition: {}, Key: {}, Exception: {}",
                    consumerRecord.topic(),
                    consumerRecord.offset(),
                    consumerRecord.partition(),
                    consumerRecord.key(),
                    ExceptionUtils.getRootCauseMessage(e)),
            new FixedBackOff(1000L, 3L));
    // Intentional: making DeserializationException retryable again makes it easier to simulate the issue.
    seekToCurrentErrorHandler.removeNotRetryableException(DeserializationException.class);
    factory.setErrorHandler(seekToCurrentErrorHandler);
    ...
}

The SeekToCurrentErrorHandler prints the full stack trace of any exception (a DeserializationException in this case) that occurs before poll() returns to the listener.

This results in a ton of log entries. Is there a way to suppress this behavior? I tried this tip from the documentation, but it only changes the log level; it does not suppress the stack trace:

/**
 * Set the level at which the exception thrown by this handler is logged.
 * @param logLevel the level (default ERROR).
 */
public void setLogLevel(KafkaException.Level logLevel) {
    ...
}
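For completeness, applying that tip to the handler built above amounts to a one-line configuration call (the WARN level here is just an example):

```java
// Changes the level at which the container logs the exception thrown by the
// handler -- but the stack trace is still printed at the new level.
seekToCurrentErrorHandler.setLogLevel(KafkaException.Level.WARN);
```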

I have upgraded all dependencies to the latest versions:

  • Spring Boot - 2.5.7
  • Spring - 5.3.13
  • Spring Kafka - 2.7.9
  • Kafka Clients - 2.8.1

Stack trace:

2021-11-19 12:37:53.047  INFO 560 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.SubscriptionState    : [Consumer clientId=consumer-MyListenerGroup-Testing-1, groupId=MyListenerGroup-Testing] Resetting offset for partition some-topic-0 to position FetchPosition{offset=7763, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[someserver.testing.net:9092 (id: 30 rack: null)], epoch=8}}.
2021-11-19 12:37:54.554 DEBUG 560 --- [ntainer#0-0-C-1] essageListenerContainer$ListenerConsumer : Received: 2 records
2021-11-19 12:37:55.570  INFO 560 --- [ntainer#0-0-C-1] o.a.k.clients.consumer.KafkaConsumer     : [Consumer clientId=consumer-MyListenerGroup-Testing-1, groupId=MyListenerGroup-Testing] Seeking to offset 7763 for partition some-topic-0
2021-11-19 12:37:55.578 ERROR 560 --- [ntainer#0-0-C-1] essageListenerContainer$ListenerConsumer : Error handler threw an exception
org.springframework.kafka.KafkaException: Seek to current after exception; nested exception is org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed; nested exception is org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 27
    at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:206) ~[spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:112) ~[spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeErrorHandler(KafkaMessageListenerContainer.java:2371) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2240) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2154) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2036) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:1709) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1276) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1268) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1163) [spring-kafka-2.7.9.jar:2.7.9]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_292]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_292]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_292]
Caused by: org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed; nested exception is org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 27
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2387) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.checkDeser(KafkaMessageListenerContainer.java:2434) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2302) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2229) [spring-kafka-2.7.9.jar:2.7.9]
    ... 9 common frames omitted
Caused by: org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 27
    at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserializationException(ErrorHandlingDeserializer.java:216) ~[spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:191) ~[spring-kafka-2.7.9.jar:2.7.9]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1386) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:133) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1617) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1700(Fetcher.java:1453) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:686) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:637) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1303) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1237) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1210) ~[kafka-clients-2.8.1.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_292]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_292]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_292]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_292]
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.3.13.jar:5.3.13]
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:208) ~[spring-aop-5.3.13.jar:5.3.13]
    at com.sun.proxy.$Proxy140.poll(Unknown Source) ~[na:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1414) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1251) [spring-kafka-2.7.9.jar:2.7.9]
    ... 4 common frames omitted
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 27
Caused by: org.apache.avro.AvroTypeException: Found com.foo.bar.CustomerRelationshipStateNew, expecting com.foo.bar.CustomerRelationshipStateNew, missing required field Status
    at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:309) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.io.parsing.Parser.advance(Parser.java:86) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:128) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:239) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readArray(GenericDatumReader.java:298) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:183) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:187) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.specific.SpecificDatumReader.readField(SpecificDatumReader.java:136) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.specific.SpecificDatumReader.readRecord(SpecificDatumReader.java:123) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160) ~[avro-1.9.2.jar:1.9.2]
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153) ~[avro-1.9.2.jar:1.9.2]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.read(AbstractKafkaAvroDeserializer.java:351) ~[kafka-avro-serializer-5.5.6.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:99) ~[kafka-avro-serializer-5.5.6.jar:na]
    at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:78) ~[kafka-avro-serializer-5.5.6.jar:na]
    at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55) ~[kafka-avro-serializer-5.5.6.jar:na]
    at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60) ~[kafka-clients-2.8.1.jar:na]
    at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:188) ~[spring-kafka-2.7.9.jar:2.7.9]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1386) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:133) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1617) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1700(Fetcher.java:1453) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:686) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:637) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1303) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1237) ~[kafka-clients-2.8.1.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1210) ~[kafka-clients-2.8.1.jar:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_292]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_292]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_292]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_292]
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.3.13.jar:5.3.13]
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:208) ~[spring-aop-5.3.13.jar:5.3.13]
    at com.sun.proxy.$Proxy140.poll(Unknown Source) ~[na:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1414) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1251) [spring-kafka-2.7.9.jar:2.7.9]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1163) [spring-kafka-2.7.9.jar:2.7.9]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_292]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_292]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_292]

OK. I see now.

So, I'd suggest you implement your own RemainingRecordsErrorHandler that delegates to that SeekToCurrentErrorHandler, invoking the delegate from this method:

public void handle(Exception thrownException, @Nullable List<ConsumerRecord<?, ?>> records,
        Consumer<?, ?> consumer, MessageListenerContainer container) {

That way you can catch all the exceptions from the delegate and re-throw them however you need, e.g. dropping the stack traces from the KafkaExceptions.
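The core of that idea is clearing the stack trace (and those of the nested causes) before re-throwing, so the container's logger prints only the message lines. The sketch below demonstrates this part in plain Java so it compiles standalone; the spring-kafka wiring (a class implementing the handler interface above, catching what the SeekToCurrentErrorHandler delegate throws) is omitted, and the helper name StackTraceStripper is made up for illustration:

```java
// Hypothetical helper: strips the stack trace from an exception and all of
// its causes, so logging frameworks emit only the "Caused by:" message lines.
public class StackTraceStripper {

    /** Clears the stack trace of {@code ex} and every nested cause, then returns it. */
    public static <T extends Throwable> T strip(T ex) {
        for (Throwable t = ex; t != null; t = t.getCause()) {
            t.setStackTrace(new StackTraceElement[0]);
        }
        return ex;
    }

    public static void main(String[] args) {
        // Simulate the nested exception the error handler would re-throw.
        RuntimeException ex = new RuntimeException("Seek to current after exception",
                new IllegalStateException("failed to deserialize"));
        strip(ex);
        // Both the exception and its cause now report empty stack traces,
        // while the messages are preserved.
        System.out.println(ex.getStackTrace().length);            // prints 0
        System.out.println(ex.getCause().getStackTrace().length); // prints 0
        System.out.println(ex.getMessage());
    }
}
```

In a delegating error handler you would apply `strip(...)` inside a catch block around the delegate's `handle(...)` call and then re-throw; `Throwable.setStackTrace` is standard Java, so no spring-kafka internals are involved in the suppression itself.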
