When using kafka + beam + flink: Found interface org.apache.flink.streaming.api.operators.InternalTimer, but class was expected

I am writing my own Apache Beam demo that uses Kafka to publish/subscribe order data from a shopping system, builds the dataflow with Beam, and runs it on Flink. I hit a very unusual exception:

    Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.flink.streaming.api.operators.InternalTimer, but class was expected
        at org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator.fireTimer(WindowDoFnOperator.java:129)
        at org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.onProcessingTime(DoFnOperator.java:704)
        at org.apache.flink.streaming.api.operators.InternalTimerServiceImpl.onProcessingTime(InternalTimerServiceImpl.java:235)
        at org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeService$TriggerTask.run(SystemProcessingTimeService.java:285)

My code is:

    package com.meikeland.dataflow;

    import org.apache.beam.runners.flink.FlinkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.*;
    import org.apache.beam.sdk.transforms.windowing.*;
    import org.apache.beam.sdk.values.KV;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.joda.time.Duration;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class GameStats {

      private static final Logger logger = LoggerFactory.getLogger(GameStats.class);

      public static void main(String[] args) {
        KFOptions options = PipelineOptionsFactory.fromArgs(args).as(KFOptions.class);
        options.setRunner(FlinkRunner.class);
        options.setStreaming(true);
        logger.info("brokers address is: {}", options.getBrokers());
        runDemoCount(options);
      }

      private static void runDemoCount(KFOptions options) {
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // read order events from kafka
            .apply("ConsumeKafka",
                KafkaIO.<Long, String>read().withBootstrapServers(options.getBrokers()).withTopic("tracking.order.goods")
                    .withKeyDeserializer(LongDeserializer.class).withValueDeserializer(StringDeserializer.class)
                    .withLogAppendTime().withoutMetadata())
            .apply(Values.create())
            .apply("ParseOrderInfo", ParDo.of(new ParseOrderInfoFn()))
            .apply("SetTimestamp", WithTimestamps.of(OrderInfo::getCreatedAt))
            .apply("ExtractOrderID", MapElements.via(new SimpleFunction<OrderInfo, Integer>() {
              public Integer apply(OrderInfo o) {
                logger.info("processed orderID: {}", o.getOrderID());
                return o.getOrderID();
              }
            }))
            // window
            .apply("FixedWindowsOrderID",
                Window.<Integer>into(FixedWindows.of(new Duration(1000 * 60)))
                    .triggering(AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane().plusDelayOf(new Duration(1000 * 60)))
                        .withLateFirings(AfterPane.elementCountAtLeast(1)))
                    .withAllowedLateness(new Duration(1000 * 60)).accumulatingFiredPanes())
            .apply("Count", Count.<Integer>perElement()).apply("ToString", ParDo.of(new DoFn<KV<Integer, Long>, String>() {
              @ProcessElement
              public void processElement(@Element KV<Integer, Long> element, IntervalWindow window,
                  OutputReceiver<String> r) {
                logger.info("the order is : {}, and count is : {}", element.getKey(), element.getValue());
                r.output(String.format("interval :%s, Order ID: %d, Count :%d", window.start().toString(), element.getKey(),
                    element.getValue()));
              }
            })).apply("WriteToKafka", KafkaIO.<Void, String>write().withBootstrapServers(options.getBrokers())
                .withTopic("streaming.order.count").withValueSerializer(StringSerializer.class).values());

        pipeline.run().waitUntilFinish();
      }
    }

The error seems to be in the windowing, but I can't figure it out. I have googled everywhere and nobody seems to have hit a similar error, so I must be getting some small detail wrong. Can anyone help me?

I ran into the same problem and solved it by checking that my Flink version is compatible with Beam:

https://beam.apache.org/documentation/runners/flink/

In my case, I had Beam 2.6 and Flink 1.5.4.

Hope that helps.

Regards, Ali
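
In sbt terms, that pairing would look roughly like the sketch below. This is a minimal sketch: as far as I can tell, beam-runners-flink_2.11 was the runner artifact name before Beam introduced the Flink-version-suffixed artifacts, so verify it against the compatibility table linked above.

    // build.sbt -- minimal sketch for the Beam 2.6 / Flink 1.5.x pairing above
    val beamVersion = "2.6.0"

    libraryDependencies ++= Seq(
      // Beam Java SDK core
      "org.apache.beam" % "beam-sdks-java-core" % beamVersion,
      // Flink runner for Beam 2.6, built against the Flink 1.5.x line
      "org.apache.beam" % "beam-runners-flink_2.11" % beamVersion
    )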

I had this problem too and finally solved it.

If your project depends on

    "org.apache.beam" % "beam-runners-flink" % beamVersion

it is compiled against the InternalTimer class.

Looking at the Scala API docs for org.apache.flink.streaming, I see that InternalTimer became an interface after Flink 1.6.

To use the Apache Beam FlinkRunner correctly with the InternalTimer interface on Flink 1.6 and later, your project must depend on

    "org.apache.beam" % "beam-runners-flink-1.6" % beamVersion

or

    "org.apache.beam" % "beam-runners-flink-1.7" % beamVersion

or

    "org.apache.beam" % "beam-runners-flink-1.8" % beamVersion

and everything will work fine.
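
Putting it together, here is a minimal build.sbt sketch; the Beam version below is only an assumed example, and the artifact suffix must match the Flink version your cluster actually runs:

    // build.sbt -- minimal sketch; pin the runner artifact suffix to the
    // cluster's Flink version (here assumed to be Flink 1.8)
    val beamVersion = "2.13.0" // assumed example of a release that ships beam-runners-flink-1.8

    libraryDependencies ++= Seq(
      // Beam Java SDK core
      "org.apache.beam" % "beam-sdks-java-core" % beamVersion,
      // runner compiled against Flink 1.8, where InternalTimer is an interface
      "org.apache.beam" % "beam-runners-flink-1.8" % beamVersion
    )

With the suffix matched, the runner's WindowDoFnOperator is compiled against the same shape of InternalTimer (an interface rather than a class) that exists at runtime, which is exactly what the IncompatibleClassChangeError was complaining about.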
