I have a DTO containing a field of type String which holds a regex. To be a valid Java string literal, every backslash is doubled, because the first one is the escape character:
private final String regex = "myapp\\.\\w{2,3}\\/confirmation(.*)";
The actual regex to use is myapp\.\w{2,3}\/confirmation(.*).
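To make the two levels explicit, here is a minimal runnable sketch (the sample URL is illustrative): the source literal doubles every backslash, but the in-memory string holds single backslashes, and that string compiles directly as a regex.

```java
public class RegexLiteralDemo {
    public static void main(String[] args) {
        // In source code every backslash is doubled; at runtime each
        // pair collapses to a single backslash character.
        String regex = "myapp\\.\\w{2,3}\\/confirmation(.*)";
        System.out.println(regex); // myapp\.\w{2,3}\/confirmation(.*)
        // The runtime string contains single backslashes:
        System.out.println(regex.indexOf('\\')); // 5 (right after "myapp")
        // And it works as-is as a regex:
        System.out.println("myapp.de/confirmation".matches(regex)); // true
    }
}
```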
I send this DTO in a Kafka message, and the serialization is done by Jackson:
ProducerRecord<String, String> record = new ProducerRecord<>(
        kafkaTopicProperties.getTopic(),
        String.valueOf(myDto.getOrderId()),
        objectMapper.writeValueAsString(myDto)
);
Understandably, Jackson cannot distinguish a normal string from a regex string, and will send the escaped Java string as-is. Moreover, omitting the escaping would produce invalid JSON (at least, when I edit a .json file and delete the escaping backslashes, IntelliJ shows a parsing error), so for valid JSON I also need the escaping. Normal so far.
But then the Kafka consumer will receive an escaped regex string and will have to un-escape it (removing the extra backslashes). Here comes the problem: a syntactic change results in a semantic difference.
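To see exactly what ends up on the wire, here is a sketch of the JSON escaping rule applied to the regex (a minimal illustration of the rule, not a full JSON encoder): every in-memory backslash must appear doubled in the JSON text.

```java
public class JsonEscapeDemo {
    public static void main(String[] args) {
        // In-memory regex, single backslashes
        String regex = "myapp\\.\\w{2,3}\\/confirmation(.*)";
        // A JSON encoder must double every backslash in the output text:
        String jsonValue = "\"" + regex.replace("\\", "\\\\") + "\"";
        System.out.println(jsonValue);
        // -> "myapp\\.\\w{2,3}\\/confirmation(.*)"
    }
}
```

This doubled form is what a consumer reading the message as a raw string will see.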
Since Kafka puts no restrictions on the payload, we are free to un-escape before sending, because it would just be plain text.
But can Jackson do this magic for me?
Thanks to @Bohemian and @NyamiouTheGaleanthrope.
Indeed, as you said, I have found the problem: the deserializer in the consumer should be org.springframework.kafka.support.serializer.JsonDeserializer, and in the producer the serializer should be org.springframework.kafka.support.serializer.JsonSerializer. Then all is good: I can see in the log that the regex has no extra escape characters. I was using String(De)Serializer before.
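The reason this works: the extra backslashes exist only in the JSON text, and a matching serializer/deserializer pair cancels them out. A plain-Java sketch of the round trip (the naive un-escape is valid here only because backslash is the sole escaped character in this string):

```java
public class JsonRoundTripDemo {
    public static void main(String[] args) {
        String regex = "myapp\\.\\w{2,3}\\/confirmation(.*)";
        // Encode: double every backslash (what a JSON writer does).
        String encoded = regex.replace("\\", "\\\\");
        // Decode: collapse the pairs again (what a JSON reader does).
        String decoded = encoded.replace("\\\\", "\\");
        // The round trip is lossless: the consumer sees the original regex.
        System.out.println(decoded.equals(regex)); // true
    }
}
```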
Configs for both sides, put together:
application.yml:
spring:
  kafka:
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer: # would be picked up if you autowired the consumer
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties.spring.json.trusted.packages: '*'
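If you wire the producer in Java instead of application.yml, the equivalent configuration looks roughly like this (a sketch assuming spring-kafka on the classpath; the bootstrap address and bean names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaJsonConfig {

    @Bean
    public ProducerFactory<String, OrderMessageDto> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // JsonSerializer writes the DTO as JSON, so the regex field is
        // escaped on the wire and un-escaped again by JsonDeserializer.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, OrderMessageDto> kafkaTemplateOrder() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```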
Producer:
ProducerRecord<String, OrderMessageDto> record = new ProducerRecord<>(
        kafkaTopicProperties.getTopic(),
        String.valueOf(orderDto.getId()),
        orderDto
);
kafkaTemplateOrder.send(record).get(kafkaTopicProperties.getTimeout(), TimeUnit.MILLISECONDS);
Consumer (I only have a consumer in tests, so I have to configure it by hand):
@Autowired
private EmbeddedKafkaBroker kafkaBroker;
...
private ConsumerRecords<String, OrderPaymentUrlDto> consumeRecords(String topic) {
    Map<String, Object> consumerProps = KafkaTestUtils.consumerProps(topic, "true", kafkaBroker);
    consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    JsonDeserializer<OrderPaymentUrlDto> valueDeserializer = new JsonDeserializer<>();
    valueDeserializer.addTrustedPackages("*"); // necessary to include the DTO as a trusted type
    ConsumerFactory<String, OrderPaymentUrlDto> factory = new DefaultKafkaConsumerFactory<>(
            consumerProps,
            new StringDeserializer(),
            valueDeserializer
    );
    Consumer<String, OrderPaymentUrlDto> consumer = factory.createConsumer();
    kafkaBroker.consumeFromAnEmbeddedTopic(consumer, topic);
    return KafkaTestUtils.getRecords(consumer, 2000);
}