NoClassDefFoundError: org/apache/spark/sql/DataFrame in spark-cassandra-connector
I'm trying to upgrade spark-cassandra-connector from 1.4 to 1.5.
Everything seems fine, but when I run the test cases, the process gets stuck and logs this error message:
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
My pom file looks like:
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>16.0.1</version>
    </dependency>
    <!-- Scala Library -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.5</version>
    </dependency>
    <!-- Spark Cassandra Connector -->
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.cassandra</groupId>
        <artifactId>cassandra-driver-core</artifactId>
        <version>3.0.2</version>
    </dependency>
    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
            <exclusion>
                <groupId>net.java.dev.jets3t</groupId>
                <artifactId>jets3t</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
</project>
Can anyone please help me with this? If you need more info, please let me know. Thanks in advance!
Try adding this dependency:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
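The `${spark.version}` placeholder assumes a matching property is defined in your POM's `<properties>` section; a minimal sketch (the `1.5.0` value mirrors the spark-core_2.10 version already declared in the question's POM):

```xml
<properties>
    <!-- Keep Spark's module versions in one place so spark-core and
         spark-sql always resolve to the same release. -->
    <spark.version>1.5.0</spark.version>
</properties>
```

Alternatively, hard-code `1.5.0` directly in the `<version>` element of the spark-sql dependency.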
Also make sure that your spark-cassandra-connector version is compatible with the Spark version you're using. I got the same error message, even with all the proper dependencies, when trying to use an older spark-cassandra-connector with a newer Spark version. You can inspect which versions actually end up on the classpath with `mvn dependency:tree`. Refer to this table: https://github.com/datastax/spark-cassandra-connector#version-compatibility