
Flink join operator not showing any output

I am new to Flink. I have two data streams and I want to apply a keyed join over a tumbling window. The code runs without errors, but the join never produces any result. I even applied assignTimestampsAndWatermarks both in fromSource and again on the mapped streams that are joined.

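// Kafka source for topic "iotA"; the custom deserializer below converts the raw byte[] key and value into Strings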
KafkaSource<ConsumerRecord> iotA = KafkaSource.<ConsumerRecord>builder()
                .setBootstrapServers(IP)
                .setTopics("iotA")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setDeserializer(KafkaRecordDeserializationSchema.of(new KafkaDeserializationSchema<ConsumerRecord>() {
                    @Override
                    public boolean isEndOfStream(ConsumerRecord record) {
                        return false;
                    }

                    @Override
                    public ConsumerRecord deserialize(ConsumerRecord<byte[], byte[]> record) throws Exception {
                        String key = new String(record.key(), StandardCharsets.UTF_8);
                        String value = new String(record.value(), StandardCharsets.UTF_8);
                        return new ConsumerRecord(
                                record.topic(),
                                record.partition(),
                                record.offset(),
                                record.timestamp(),
                                record.timestampType(),
                                record.checksum(),
                                record.serializedKeySize(),
                                record.serializedValueSize(),
                                key,
                                value
                        );
                    }

                    @Override
                    public TypeInformation<ConsumerRecord> getProducedType() {
                        TypeInformation<ConsumerRecord> typeInfo = TypeInformation.of(ConsumerRecord.class);
                        return typeInfo;
                    }
                }))
                .build();
        KafkaSource<ConsumerRecord> iotB = //same as iotA
   
        DataStream<ConsumerRecord> iotA_datastream = env.fromSource(iotA,
                WatermarkStrategy.<ConsumerRecord>forMonotonousTimestamps()
                .withTimestampAssigner((record, timestamp) -> record.timestamp()), "Kafka Source");
    
        DataStream<ConsumerRecord> iotB_datastream = //same as iotA_datastream

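        // Replace each record's value with a single field extracted by the splitValue helper (defined elsewhere)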
        DataStream<ConsumerRecord> mapped_iotA = iotA_datastream.map(new MapFunction<ConsumerRecord, ConsumerRecord>() {
            @Override
            public ConsumerRecord map(ConsumerRecord record) throws Exception {
                String new_value = splitValue((String) record.value(), 0);
                return new ConsumerRecord(record.topic(), record.partition(), record.offset(), record.timestamp(), record.timestampType(),
                        record.checksum(), record.serializedKeySize(), record.serializedValueSize(), record.key(), new_value);
            }
        }).assignTimestampsAndWatermarks(WatermarkStrategy.<ConsumerRecord>forMonotonousTimestamps()
                .withTimestampAssigner((record, timestamp) -> record.timestamp()));

        DataStream<ConsumerRecord> mapped_iotB = //same as mapped_iotA

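        // Keyed join of the two streams on the Kafka record key, over 5-second tumbling event-time windows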
        DataStream<String> joined_stream= mapped_iotA.join(mapped_iotB)
                .where(new KeySelector<ConsumerRecord, String>() {
                    @Override
                    public String getKey(ConsumerRecord record) throws Exception {
                        System.out.println((String) record.key()+record.value());
                        return (String) record.key();
                    }
                })
                .equalTo(new KeySelector<ConsumerRecord, String>() {
                    @Override
                    public String getKey(ConsumerRecord record) throws Exception {
                        System.out.println((String) record.key()+record.value());
                        return (String) record.key();
                    }
                })
                .window(TumblingEventTimeWindows.of(Time.seconds(5)))
                .apply(new JoinFunction<ConsumerRecord, ConsumerRecord, String> (){
                    @Override
                    public String join(ConsumerRecord record1, ConsumerRecord record2) throws Exception { // never prints anything
                        System.out.println("value1" + record1.value() + "value2" + record2.value());
                        return "null";
                    }
                });

        env.execute();

I also tried other watermark strategies, such as forBoundedOutOfOrderness, as well as wider window sizes, with the same result.
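
For illustration, a minimal sketch of the forBoundedOutOfOrderness variant mentioned above, applied at the source (the 2-second bound is only an example value, and java.time.Duration must be imported):

DataStream<ConsumerRecord> iotA_bounded = env.fromSource(iotA,
        WatermarkStrategy.<ConsumerRecord>forBoundedOutOfOrderness(Duration.ofSeconds(2))
                .withTimestampAssigner((record, timestamp) -> record.timestamp()),
        "Kafka Source");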

Try attaching a sink to the job. For example,

DataStream<String> joined_stream= mapped_iotA.join(mapped_iotB)
  ...

joined_stream.print();
env.execute();
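
A file sink can be attached the same way. A sketch only: the output path and encoder are example values, and it requires the flink-connector-files dependency.

FileSink<String> fileSink = FileSink
        .forRowFormat(new Path("/tmp/joined-output"), new SimpleStringEncoder<String>("UTF-8"))
        .build();

joined_stream.sinkTo(fileSink);
env.execute();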
