
Jar file has the class, but I still get java.lang.ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord


I am running a Spark Streaming job that consumes from Kafka using the direct approach (for Kafka 0.10.0 or greater). I built the jar using maven-assembly-plugin and checked its contents with jar tf <jar file> | grep ConsumerRecord . I get the following output:

org/apache/kafka/clients/consumer/ConsumerRecord.class
org/apache/kafka/clients/consumer/ConsumerRecords$ConcatenatedIterable$1.class
org/apache/kafka/clients/consumer/ConsumerRecords$ConcatenatedIterable.class
org/apache/kafka/clients/consumer/ConsumerRecords.class

But when I run the spark-submit job on my cluster (with master set to both local and yarn), I get the following exception:

java.lang.ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord

The other option I tried was building a shaded jar using maven-shade-plugin . Same result there as well.

Here is my POM file:

<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.myCompany</groupId>
<artifactId>spark-streaming-test</artifactId>
<version>1</version>

<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <version>2.4.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>2.2.1</version>
    </dependency>

</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.0</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <finalName>shade-${project.artifactId}-${project.version}</finalName>
            </configuration>
        </plugin>

        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <archive>
                    <manifest>
                        <mainClass>com.myCompany.ReadFromKafka</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id> <!-- this is used for inheritance merges -->
                    <phase>package</phase> <!-- bind to the packaging phase -->
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

    </plugins>
</build>
</project>
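
One adjustment worth considering (my suggestion, not part of the original post): spark-core and spark-streaming are supplied by the cluster at runtime, so they can be marked provided to keep them out of the fat jar, while spark-streaming-kafka-0-10 and kafka-clients are not shipped with spark-submit and must remain bundled. A sketch:

    <!-- Sketch: Spark itself is provided by the cluster at runtime -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.5</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.4.5</version>
        <scope>provided</scope>
    </dependency>
    <!-- spark-streaming-kafka-0-10_2.11 and kafka-clients keep the default
         compile scope so they end up inside the assembly -->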

And here is my Spark Streaming code (taken from https://spark.apache.org/docs/latest/streaming-kafka-0-10-integration.html ):

package com.myCompany;

import java.util.*;
import org.apache.spark.SparkConf;
import org.apache.spark.TaskContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.*;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.StreamingContext;
import org.apache.spark.streaming.api.java.*;
import org.apache.spark.streaming.kafka010.*;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;
import scala.Tuple2;

public class ReadFromKafka {

    public static void main(String args[]) throws InterruptedException {

        SparkConf conf = new SparkConf();// .setAppName("Decryption-spark-streaming").setMaster("yarn");
        JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(5));

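        // Consumer properties for a TLS-secured Kafka cluster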
        Map<String, Object> kafkaParams = new HashMap<String, Object>();
        kafkaParams.put("bootstrap.servers", "server1:9093");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "my_cg");
        kafkaParams.put("auto.offset.reset", "earliest");
        kafkaParams.put("enable.auto.commit", false);
        kafkaParams.put("security.protocol", "SSL");
        kafkaParams.put("ssl.truststore.location", "abc.jks");
        kafkaParams.put("ssl.truststore.password", "changeit");
        kafkaParams.put("ssl.keystore.location", "abc.jks");
        kafkaParams.put("ssl.keystore.password", "changeme");
        kafkaParams.put("ssl.key.password", "changeme");

        Collection<String> topics = Arrays.asList("myTopic");

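        // Create the direct stream; this is where ConsumerRecord from kafka-clients is needed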
        JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(jsc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));

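        // Example map from the integration guide; the returned DStream is not used further here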
        stream.mapToPair(record -> new Tuple2<>(record.key(), record.value()));

        stream.foreachRDD(rdd -> {
            OffsetRange[] offsetRanges = ((HasOffsetRanges) rdd.rdd()).offsetRanges();
            rdd.foreachPartition(consumerRecords -> {
                OffsetRange o = offsetRanges[TaskContext.get().partitionId()];
                System.out.println(o.topic() + " " + o.partition() + " " + o.fromOffset() + " " + o.untilOffset());
            });
        });

        stream.foreachRDD(rdd -> {
            OffsetRange[] offsetRanges = ((HasOffsetRanges) rdd.rdd()).offsetRanges();

            // some time later, after outputs have completed
            ((CanCommitOffsets) stream.inputDStream()).commitAsync(offsetRanges);
        });

        // Start the computation
        jsc.start();
        jsc.awaitTermination();
    }
}
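
For reference, with the make-assembly execution bound to the package phase, the fat jar referenced below should come out of a plain Maven build:

mvn clean package

which should produce target/spark-streaming-test-1-jar-with-dependencies.jar .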

Adding the dependent jar file ( spark-streaming-kafka-0-10_2.11.jar ) to the spark-submit command resolved the issue:

spark-submit --master yarn --deploy-mode cluster --name spark-streaming-test \
  --executor-memory 1g --num-executors 4 --driver-memory 1g \
  --jars /home/spark/jars/spark-streaming-kafka-0-10_2.11.jar \
  --class com.myCompany.ReadFromKafka spark-streaming-test-1-jar-with-dependencies.jar
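
An alternative, assuming the cluster can reach a Maven repository, is to let spark-submit resolve the Kafka integration (and its transitive kafka-clients dependency) via --packages instead of shipping the jar by hand:

spark-submit --master yarn --deploy-mode cluster --name spark-streaming-test \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.4.5 \
  --class com.myCompany.ReadFromKafka spark-streaming-test-1-jar-with-dependencies.jar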
