

Spring Cloud Stream Kafka: want to send Message&lt;Object&gt; but Spring sends Message&lt;byte[]&gt;; the payload is byte[], not a GenericMessage in JSON format

I am using Spring Cloud Stream with Kafka, Avro, and the Schema Registry, and I write reactive code in the functional style. I want to produce a message like this: GenericMessage [payload={"id": "efb90cd6-e022-4d82-9898-6b78114cfb01", "type": "FirstRankPaymentAgreed",...}, headers={deliveryAttempt=1, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=cucumber-test1, contentType=application/json...}]

But it produces a message like this: GenericMessage [payload=byte[2151], headers={deliveryAttempt=1, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=cucumber-test1, contentType=application/json...}]

spring-cloud-stream-schema 2.2.1.RELEASE
spring-cloud-stream 3.2.2
spring-cloud-starter-stream-kafka 3.2.5
spring-cloud-stream-binder-kafka-streams 3.2.5
spring-cloud-function-context 3.2.1
kafka-avro-serializer 5.3.0
spring-kafka 2.9.0
org.apache.avro.avro 1.11.1

I am using the reactive functional style.

Function<Flux<Message<Object>>, Flux<Message<FirstRankPaymentAgreed>>> handler() {
    return flux -> flux.map(message ->
            MessageBuilder.withPayload((FirstRankPaymentAgreed) message.getPayload()).build());
}

The result of this producer is that the received message is: GenericMessage [payload=byte[1056], headers={contentType=application/json, id=7d3b65c1-11d8-0fb2-a277-0603f58fd516, timestamp=1672174971194}]

In the payload we have a byte array instead of JSON.

I want something like this: GenericMessage [payload={"id": "254335d0-b631-454e-98de-d2d5129af4c0", "type": "ObjectClass", "delta"...

cloud:
  stream:
    function:
      definition: dispatchConsumerFromTempoComposerEvent
    bindings:
      dispatchConsumerFromTempoComposerEvent-in-0:
        destination: tempo-composer-event
      dispatchConsumerFromTempoComposerEvent-out-0:
        destination: tempo-composer-event
        contentType: application/json  # --> I tried adding this
    kafka:
      binder:
        auto-create-topics: false
        consumer-properties:
          value:
            subject:
              name:
                strategy: io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
          key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
          value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
          schema.registry.url: http://localhost:8081
          specific.avro.reader: true
        producer-properties:
          value:
            subject:
              name:
                strategy: io.confluent.kafka.serializers.subject.TopicRecordNameStrategy  # --> I tried adding this
          key.serializer: org.apache.kafka.common.serialization.StringSerializer
          value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          schema.registry.url: http://localhost:8081
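
One thing the config above does not set is native encoding. By default the Spring Cloud Stream binder converts the payload itself (producing the byte[] payload seen here) instead of delegating to the configured KafkaAvroSerializer. A hedged sketch of the binding change, reusing the binding names from the question (not a confirmed fix, just the standard way to hand serialization to the Kafka serializers):

```yaml
cloud:
  stream:
    bindings:
      dispatchConsumerFromTempoComposerEvent-out-0:
        destination: tempo-composer-event
        producer:
          useNativeEncoding: true   # let key.serializer / value.serializer do the work
      dispatchConsumerFromTempoComposerEvent-in-0:
        destination: tempo-composer-event
        consumer:
          useNativeDecoding: true   # let key.deserializer / value.deserializer do the work
```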

I want to migrate the project to Spring Cloud Stream. The "legacy" code:

private static final JsonGenericRecordReader recordReader = new JsonGenericRecordReader(new CompositeJsonToAvroReader(List.of(), IGNORE_UNKNOWN_FIELD));
private static KafkaProducer<String, Object> buildProducer() {
        final var config = new Properties();
        config.put("bootstrap.servers", KafkaConfiguration.kafkaHost());
        config.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        config.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        config.put("schema.registry.url", "http://" + KafkaConfiguration.schemaRegistryHost());
        config.put("value.subject.name.strategy", TopicRecordNameStrategy.class.getName());
        return new KafkaProducer<>(config);
    }
final var getClassSchema = avroClass.getMethod("getClassSchema");
            final var specificRecord = recordReader.read(record.getBytes(StandardCharsets.UTF_8), (Schema) getClassSchema.invoke(null));
            final var producerRecord = new ProducerRecord<String, Object>(topic, key, specificRecord);
            if (TestContext.traceId() != null) {
                producerRecord.headers().add("b3", (TestContext.traceId() + "-" + TestContext.traceId() + "-1").getBytes());
            }
            headers.forEach((name, value) -> producerRecord.headers().add(name, value.getBytes()));
            TestContext.additionalKafkaHeaders().forEach((name, value) -> producerRecord.headers().add(name, value.getBytes()));

            RecordMetadata recordMetadata = getProducer().send(producerRecord).get();
            return recordMetadata;
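
Both the legacy producer and the new binder config rely on TopicRecordNameStrategy. With that strategy, schemas are registered under the subject "<topic>-<fully qualified record name>", which is what lets a single topic such as tempo-composer-event carry several Avro event types. A minimal illustrative sketch (the class and method names below are mine, not Confluent's API):

```java
// Illustrative sketch of the subject name TopicRecordNameStrategy produces:
// "<topic>-<fully qualified record name>".
public class SubjectNames {
    static String topicRecordNameSubject(String topic, String recordFullName) {
        return topic + "-" + recordFullName;
    }

    public static void main(String[] args) {
        // e.g. tempo-composer-event-com.example.FirstRankPaymentAgreed
        System.out.println(topicRecordNameSubject(
                "tempo-composer-event", "com.example.FirstRankPaymentAgreed"));
    }
}
```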

This legacy code produces:

GenericMessage [payload={"id": "efb90cd6-e022-4d82-9898-6b78114cfb01", "type": "FirstRankPaymentAgreed",...},  headers={deliveryAttempt=1, kafka_timestampType=CREATE_TIME, kafka_receivedMessageKey=cucumber-test1, contentType=application/json...}]

The payload is in JSON format. After migrating to Spring Cloud Stream, I get [payload=byte[2151]...

I just use the schema registry. Instead of using the message payload to retrieve the class and obtain its schema in order to convert the byte array into the final Avro object, I use the schema registry and the schema ID: skip the first bytes of the payload and read the schema ID to find the correct schema. You can simply annotate your application class with @EnableSchemaRegistry. Your Function signature looks like this:

@Bean
public Function<Flux<Message<Object>>, Flux<Message<Object>>> handler()

Spring Cloud Stream reads the byte array and gets the schema ID, finds the correct schema in the schema registry, and converts the byte array into the correct class.
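
The "skip the first bytes and read the schema ID" step refers to the Confluent wire format: serialized records start with a magic byte (0x0), followed by a 4-byte big-endian schema registry ID, followed by the Avro-encoded body. A minimal sketch of extracting that ID (the class name is illustrative):

```java
import java.nio.ByteBuffer;

// Confluent wire format: byte 0 is a magic byte (0x0), bytes 1-4 are the
// schema registry id (big-endian int), and the Avro binary body follows.
public class WireFormat {
    static int schemaId(byte[] payload) {
        ByteBuffer buf = ByteBuffer.wrap(payload);
        byte magic = buf.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Unknown magic byte: " + magic);
        }
        return buf.getInt(); // the id used to look up the schema in the registry
    }
}
```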

