
Flink Application throws ClassNotFoundException in Java

I have a Flink cluster on YARN and used the flink-quickstart-java archetype to build a demo project. After building a fat jar with 'mvn clean package -Pbuild-jar' and submitting the program with 'flink run -m yarn-cluster -yn 2 ./flink-SNAPSHOT-1.0.jar', the program throws the following exception:

java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.setDeserializer(FlinkKafkaConsumer09.java:290)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:216)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:154)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:128)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:112)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:79)
    at stream.TransferKafka.main(TransferKafka.java:19)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:417)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:395)
    at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:828)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:283)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1080)
    at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1127)
    at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1124)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1781)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1124)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 24 more

Here is my demo:

import java.util.Properties;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public static void main(String[] args) {
    final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // Kafka connection properties (broker address elided in the original post)
    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "ip:port");
    props.setProperty("group.id", "NewFlinkTest");

    // Read from topic "kafka_test" and write the records straight back out to "kafka_test_out"
    DataStreamSource<String> stream = env.addSource(new FlinkKafkaConsumer010<>("kafka_test", new SimpleStringSchema(), props));
    stream.addSink(new FlinkKafkaProducer010<>("kafka_test_out", new SimpleStringSchema(), props));

    try {
        env.execute("Flink Jar Test");
    } catch (Exception e) {
        e.printStackTrace();
    }
}

And some version information:
Flink version: 1.4.0

Hadoop version: 2.7.2

Kafka version: 0.10.2.1

JDK version: 1.8


POM dependencies

Edit 1:

<?xml version="1.0" encoding="UTF-8"?>
<dependencies>
   <!-- Apache Flink dependencies -->
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-core</artifactId>
      <version>${flink.version}</version>
   </dependency>
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-java</artifactId>
      <version>${flink.version}</version>
   </dependency>
   <dependency>
<!-- This dependency is required to actually execute jobs. It is currently pulled in by flink-streaming-java, but we explicitly depend on it to safeguard against future changes. -->
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
   </dependency>
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
   </dependency>
<!-- explicitly add a standard logging framework, as Flink does not have a hard dependency on one specific framework by default -->
   <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>${slf4j.version}</version>
   </dependency>
   <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>${log4j.version}</version>
   </dependency>
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-rabbitmq_2.11</artifactId>
      <version>1.4.0</version>
   </dependency>
   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
      <version>1.4.0</version>
   </dependency>
</dependencies>
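
The class the stack trace cannot find, org.apache.kafka.common.serialization.ByteArrayDeserializer, lives in kafka-clients, which this POM only pulls in transitively through the connector. As a sketch of one way to rule out the transitive dependency being dropped from the fat jar (the explicit declaration is an assumption; 0.10.2.1 is the broker version from the post), kafka-clients could be pinned directly:

   <!-- Hypothetical addition: declare kafka-clients explicitly (default compile
        scope) so the fat-jar build cannot silently drop the transitive dependency. -->
   <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>0.10.2.1</version>
   </dependency>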

After some attempts, I found that the code throwing the exception differs from the jar I packaged into my uber-jar. I think the main reason is that the client server has an old version of the flink-connector-kafka jar, but no matter how I set the flink-conf.yaml property 'yarn.per-job-cluster.include-user-jar', the program always throws the same exception.
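
For reference, a minimal sketch of that option in flink-conf.yaml, assuming Flink 1.4's YARN per-job settings (the documented values are ORDER, FIRST, LAST and DISABLED; FIRST puts the user jar at the head of the system classpath so it wins over older jars shipped with the cluster):

# flink-conf.yaml (sketch)
yarn.per-job-cluster.include-user-jar: FIRST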


Edit 2:

After adding kafka-clients:0.10.2.1 to flink_home/lib/, it works. But I still don't know why it doesn't read the class files in the uber jar.
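
A minimal sketch of that workaround, assuming the jar already sits in the local Maven repository (all paths are assumptions):

cp ~/.m2/repository/org/apache/kafka/kafka-clients/0.10.2.1/kafka-clients-0.10.2.1.jar $FLINK_HOME/lib/
# resubmit the job afterwards so the new jar in lib/ ends up on the cluster classpath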

First, you can verify whether the missing class is in your jar file via grep 'ByteArrayDeserializer' ./flink-SNAPSHOT-1.0.jar.
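
Since a jar is just a zip archive, listing its entries gives a cleaner check than a binary grep; a short sketch:

jar tf ./flink-SNAPSHOT-1.0.jar | grep ByteArrayDeserializer
# expected output if the class was bundled:
# org/apache/kafka/common/serialization/ByteArrayDeserializer.class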

You may want to add <scope>provided</scope> to flink-streaming-scala, flink-clients, flink-table-api-scala-bridge and flink-table-planner-blink - that solved my problem.
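
As a sketch, the scope goes on each of those dependencies individually, e.g. for flink-clients (version properties as in the POM above):

   <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <!-- provided by the Flink runtime on the cluster, so it stays out of the fat jar -->
      <scope>provided</scope>
   </dependency>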
