
When using kafka + beam + flink: Found interface org.apache.flink.streaming.api.operators.InternalTimer, but class was expected

When I try to write my own Apache Beam demo, using Kafka to pub/sub data from a shopping system, using Beam to design the data flow, and running it on Flink, I get stuck on a very rare exception:

    Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.flink.streaming.api.operators.InternalTimer, but class was expected
        at org.apache.beam.runners.flink.translation.wrappers.streaming.WindowDoFnOperator.fireTimer(WindowDoFnOperator.java:129)
        at org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.onProcessingTime(DoFnOperator.java:704)
        at org.apache.flink.streaming.api.operators.InternalTimerServiceImpl.onProcessingTime(InternalTimerServiceImpl.java:235)
        at org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeService$TriggerTask.run(SystemProcessingTimeService.java:285)

My code is:

    package com.meikeland.dataflow;

    import org.apache.beam.runners.flink.FlinkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.*;
    import org.apache.beam.sdk.transforms.windowing.*;
    import org.apache.beam.sdk.values.KV;
    import org.apache.kafka.common.serialization.LongDeserializer;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.joda.time.Duration;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class GameStats {

      private static final Logger logger = LoggerFactory.getLogger(GameStats.class);

      public static void main(String[] args) {
        KFOptions options = PipelineOptionsFactory.fromArgs(args).as(KFOptions.class);
        options.setRunner(FlinkRunner.class);
        options.setStreaming(true);
        logger.info("brokers address is: {}", options.getBrokers());
        runDemoCount(options);
      }

      private static void runDemoCount(KFOptions options) {
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // read order events from kafka
            .apply("ConsumeKafka",
                KafkaIO.<Long, String>read().withBootstrapServers(options.getBrokers()).withTopic("tracking.order.goods")
                    .withKeyDeserializer(LongDeserializer.class).withValueDeserializer(StringDeserializer.class)
                    .withLogAppendTime().withoutMetadata())
        .apply(Values.create())
        // parse each Kafka value into an OrderInfo
        .apply("ParseOrderInfo", ParDo.of(new ParseOrderInfoFn()))
            .apply("SetTimestamp", WithTimestamps.of(OrderInfo::getCreatedAt))
            .apply("ExtractOrderID", MapElements.via(new SimpleFunction<OrderInfo, Integer>() {
              public Integer apply(OrderInfo o) {
                logger.info("processed orderID: {}", o.getOrderID());
                return o.getOrderID();
              }
            }))
        // 1-minute fixed windows, with early firings every minute and per-element late firings
            .apply("FixedWindowsOrderID",
                Window.<Integer>into(FixedWindows.of(new Duration(1000 * 60)))
                    .triggering(AfterWatermark.pastEndOfWindow()
                        .withEarlyFirings(AfterProcessingTime.pastFirstElementInPane().plusDelayOf(new Duration(1000 * 60)))
                        .withLateFirings(AfterPane.elementCountAtLeast(1)))
                    .withAllowedLateness(new Duration(1000 * 60)).accumulatingFiredPanes())
        .apply("Count", Count.<Integer>perElement())
        // render each (orderID, count) pair as a readable string
        .apply("ToString", ParDo.of(new DoFn<KV<Integer, Long>, String>() {
              @ProcessElement
              public void processElement(@Element KV<Integer, Long> element, IntervalWindow window,
                  OutputReceiver<String> r) {
                logger.info("the order is : {}, and count is : {}", element.getKey(), element.getValue());
                r.output(String.format("interval :%s, Order ID: %d, Count :%d", window.start().toString(), element.getKey(),
                    element.getValue()));
              }
            }))
        // publish the per-window counts back to Kafka
        .apply("WriteToKafka", KafkaIO.<Void, String>write().withBootstrapServers(options.getBrokers())
            .withTopic("streaming.order.count").withValueSerializer(StringSerializer.class).values());

        pipeline.run().waitUntilFinish();
      }
    }

It seems the error is in the window, but I can't figure it out. I've googled everywhere, and no one seems to have come across a similar error, so I must be getting some little thing wrong. Can anyone save me, please?

I had the same issue, and I fixed it by checking whether the version of Flink is compatible with Beam:

https://beam.apache.org/documentation/runners/flink/

In my case I have Beam 2.6 and Flink 1.5.4.
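Assuming a Maven build, a minimal sketch of such a matching pair looks like this (the artifact IDs below follow the naming of that era and are worth verifying against the compatibility table linked above):

    <!-- Beam 2.6.x ships its Flink runner built against the Flink 1.5.x line. -->
    <dependency>
      <groupId>org.apache.beam</groupId>
      <artifactId>beam-runners-flink_2.11</artifactId>
      <version>2.6.0</version>
    </dependency>
    <!-- Keep any direct Flink dependencies on the same 1.5.x line. -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java_2.11</artifactId>
      <version>1.5.4</version>
    </dependency>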

I hope it will help you.

Regards, Ali

I also had this problem and finally figured it out.

If your project depends on

    "org.apache.beam" % "beam-runners-flink" % beamVersion

it pulls in a runner that still uses InternalTimer as a class.

Looking at the Scala API documentation of org.apache.flink.streaming, I found that InternalTimer became an interface after Flink 1.6.
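That is a binary-incompatible change: code compiled when InternalTimer was a class throws java.lang.IncompatibleClassChangeError once it meets the interface at runtime. A hypothetical minimal sketch of the mechanism (Timer, fire, and Caller are made-up stand-ins, not Flink APIs):

    // Step 1: compile Caller against a jar in which Timer is a class:
    //   public class Timer { public void fire() {} }
    public class Caller {
      static void fireTimer(Timer t) {
        // javac emits an invokevirtual instruction here, which is only
        // legal when Timer resolves to a class at runtime.
        t.fire();
      }
    }
    // Step 2: swap the runtime classpath to a jar in which
    //   public interface Timer { void fire(); }
    // Without recompiling Caller, calling fireTimer(...) now throws:
    //   java.lang.IncompatibleClassChangeError: Found interface Timer, but class was expected
    // This is the same mismatch that hits WindowDoFnOperator.fireTimer when a
    // runner compiled against Flink <= 1.5 runs on a Flink >= 1.6 classpath.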

To use the Apache Beam FlinkRunner properly with the InternalTimer interface after Flink 1.6, your project has to depend on

    "org.apache.beam" % "beam-runners-flink-1.6" % beamVersion

or

    "org.apache.beam" % "beam-runners-flink-1.7" % beamVersion

or

    "org.apache.beam" % "beam-runners-flink-1.8" % beamVersion

whichever suffix matches the Flink version you run on.

Everything should then work fine.
