

Error with spark-cassandra-connector in Spark: java.lang.NoClassDefFoundError: com/datastax/driver/core/ProtocolOptions$Compression

I get this error when I try to connect to Cassandra with spark-cassandra-connector:

Exception in thread "main" java.lang.NoClassDefFoundError: com/datastax/driver/core/ProtocolOptions$Compression at com.datastax.spark.connector.cql.CassandraConnectorConf$.(CassandraConnectorConf.scala:112) at com.datastax.spark.connector.cql.CassandraConnectorConf$.(CassandraConnectorConf.scala) at com.datastax.spark.connector.cql.CassandraConnector$.apply(CassandraConnector.scala:192) at com.datastax.spark.connector.SparkContextFunctions.cassandraTable$default$3(SparkContextFunctions.scala:48) at main.scala.TestSpark$.main(TestSpark.scala:19) at main.scala.TestSpark.main(TestSpark.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672) at org.apa 线程“主”中的异常java.lang.NoClassDefFoundError:com.datastax.spark.connector.cql.CassandraConnectorConf $。(CassandraConnectorConf.scala:112)上的com / datastax / driver / core / ProtocolOptions $ Compression在com.datastax.spark上.connector.cql.CassandraConnectorConf $。(CassandraConnectorConf.scala)位于com.datastax.spark.connector.cql.CassandraConnector $ .apply(CassandraConnector.scala:192)位于com.datastax.spark.connector.SparkContextFunctions.cassandraTable $ default $ 3 (SparkContextFunctions.scala:48)在main.scala.TestSpark $ .main(TestSpark.scala:19)在main.scala.TestSpark.main(TestSpark.scala)在sun.reflect.NativeMethodAccessorImpl.invoke0(本机方法)在sun org.apache处java.lang.reflect.Method.invoke(Method.java:606)上的sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)上的.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)。 spark.deploy.SparkSubmit $ .org $ apache $ spark $ deploy $ SparkSubmit $$ runMain(SparkSubmit.scala:672)在org.apa che.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused by: java.lang.ClassNotFoundException: com.datastax.driver.core.ProtocolOptions$Compression at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:425) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ... 
15 more I have added the jar in the spark class path spark-cassandra-connector_2.11-1.5.0-M2.jar che.spark.deploy.SparkSubmit $ .doRunMain $ 1(SparkSubmit.scala:180)位于org.apache.spark.deploy.SparkSubmit $ .submit(SparkSubmit.scala:205)位于org.apache.spark.deploy.SparkSubmit $。 org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)处的main(SparkSubmit.scala:120)原因:java.lang.ClassNotFoundException:com.datastax.driver.core.ProtocolOptions $ Compression at java.net。 URL.java.net.URLClassLoader $ 1.run(URLClassLoader.java:366)(java.net.URLClassLoader $ 1.run(URLClassLoader.java:355)java.net.URLClassLoader.doPrivileged(本机方法)URLClassLoader.findClass(URLClassLoader。 java.lang.ClassLoader.loadClass(ClassLoader.java:425)上的java:354)sun.misc.Launcher $ AppClassLoader.loadClass(Launcher.java:308)上java.lang.ClassLoader.loadClass(ClassLoader.java:358) )...另外15个我在jar类中添加了jar类spark-cassandra-connector_2.11-1.5.0-M2.jar
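A quick way to confirm the missing class really isn't inside the connector jar is to list the jar's contents (a diagnostic sketch using standard JDK tooling; the jar name is the one from above):

$ jar tf spark-cassandra-connector_2.11-1.5.0-M2.jar | grep ProtocolOptions

If nothing is printed, the class has to come from a different jar: com.datastax.driver.core.ProtocolOptions lives in the DataStax Java driver (cassandra-driver-core), which the plain connector jar does not bundle, so that jar must also be on the classpath.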

I have added the dependencies in the sbt file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.5.0-M2"

This is the Scala program I am trying to execute:

package main.scala


import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._

/**
 * Created by Simo on 01.12.15.
 */
object TestSpark {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "54.229.218.236")
      .setAppName("Simple Application")
    val sc = new SparkContext("local", "test", conf)
    val rdd = sc.cassandraTable("test", "kv")
    println(rdd.count)
    println(rdd.first)
    println(rdd.map(_.getInt("value")).sum)
  }
}
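An equivalent way to wire this up is to set the master on the SparkConf and use the single-argument SparkContext constructor (a sketch using only values from the program above; note that in the original, the "test" app name passed to the three-argument constructor overrides the "Simple Application" set on the conf):

val conf = new SparkConf(true)
  .setMaster("local")
  .setAppName("Simple Application")
  .set("spark.cassandra.connection.host", "54.229.218.236")
val sc = new SparkContext(conf)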

And this is how I run it:

$ sbt package
$ $SPARK_HOME/bin/spark-submit --class "main.scala.TestSpark" target/scala-2.11/simple-project_2.11-1.0.jar
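sbt package builds a jar containing only the project's own classes, so the connector and its transitive dependencies are not inside simple-project_2.11-1.0.jar. One way to make them visible at runtime is to pass them to spark-submit explicitly (a sketch; the driver jar name is illustrative):

$ $SPARK_HOME/bin/spark-submit \
    --class "main.scala.TestSpark" \
    --jars spark-cassandra-connector_2.11-1.5.0-M2.jar,cassandra-driver-core-2.1.9.jar \
    target/scala-2.11/simple-project_2.11-1.0.jar

--jars takes a comma-separated list and adds the jars to both the driver and executor classpaths.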

Can you help me understand what I'm doing wrong?

Thanks!

Edit:

I have tried adding the DataStax driver to the dependency list and to the Spark classpath:

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.9"
libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-mapping" % "2.1.9"

The previous error no longer appears, but now I get another one:

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef; 线程“主”中的异常java.lang.NoSuchMethodError:scala.runtime.ObjectRef.zero()Lscala / runtime / ObjectRef; at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150) at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31) at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56) at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81) at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120) at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241) at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(Cassandr 在com.datastax.spark.connector.cql.CassandraConnector $$ anonfun $ 2上的com.datastax.spark.connector.cql.CassandraConnector $ .com $ datastax $ spark $ connector $ cql $ CassandraConnector $$ createSession(CassandraConnector.scala)处。在com.datastax.spark.connector.cql上应用(CassandraConnector.scala:150)。在com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.com)上应用(CassandraConnector.scala:150) scala:31)位于com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)位于com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)位于com.datastax .spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)位于com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)位于com.datastax.spark.connector.cql.Schema $ .fromCassandra(Schema.scala:241)在com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider $ class.tableDef(Cassandr aTableRowReaderProvider.scala:51) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:146) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919) at org.apache.spark.rdd.RDD.count(RDD.scala:1121) at main.scala.TestSpark$.main(TestSpark.scala:20) at main.scala.TestSpark.ma aTableRowReaderProvider.scala:51)位于com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef $ lzycompute(CassandraTableScanRDD.scala:59)位于com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD。 com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider $ 
class.verify(CassandraTableRowReaderProvider.scala:146)com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59)com.datastaxs。 org.apache.spark.rdd.RDD $$ anonfun $ partitions $ 2.apply(RDD.scala:239)上的.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143)在org.apache.spark.rdd.RDD在org.apache.spark.rdd.RDD.partitions(RDD.scala:237)在scala的$$ anonfun $ partitions $ 2.apply(RDD.scala:237)在scala.Option.getOrElse(Option.scala:120)在org。 org.apache.spark.rdd.RDD.count(RDD.scala:1121)的apache.spark.SparkContext.runJob(SparkContext.scala:1919)的main.scala.TestSpark $ .main(TestSpark.scala:20)的main.scala.TestSpark.ma in(TestSpark.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) in(TestSpark.scala)位于sun.reflect.NativeMethodAccessorImpl.invoke0(本地方法)位于sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)位于sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) org.apache.spark.deploy.SparkSubmit $ .org $ apache $ spark $ deploy $ SparkSubmit $$ runMain(SparkSubmit.scala:672)处的.lang.reflect.Method.invoke(Method.java:606) org.apache.spark.deploy.SparkSubmit $ .submit(SparkSubmit.scala:205)上的.spark.deploy.SparkSubmit $ .doRunMain $ 1(SparkSubmit.scala:180)在org.apache.spark.deploy.SparkSubmit $ .main (SparkSubmit.scala:120)在org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
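A NoSuchMethodError on scala.runtime.* is the classic symptom of mixing Scala binary versions: here a _2.11 build of the connector is apparently running on a Spark distribution compiled for Scala 2.10. The fix, as the next edit describes, is to pin the build to Spark's Scala version; a minimal sketch of the relevant build.sbt lines (the full file appears further down):

scalaVersion := "2.10.6"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"

With scalaVersion set to 2.10.6, sbt package also writes the jar to target/scala-2.10/ instead of target/scala-2.11/.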

Edit 2: Compiled with Scala 2.10.6 (the same Scala version as the Spark distribution). The previous error no longer appears, but I now get this new error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/AsyncFunction at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:36) at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85) at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150) at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150) at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31) at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56) at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81) at com.datastax.spark.conne 线程“主要” java.lang.NoClassDefFoundError中的异常:com.datastax.spark.connector.cql.DefaultConnectionFactory $ .clusterBuilder(CassandraConnectionFactory.scala:36)处的com / google / common / util / concurrent / AsyncFunction。 spark.connector.cql.DefaultConnectionFactory $ .createCluster(CassandraConnectionFactory.scala:85)位于com.datastax.spark.connector.cql.CassandraConnector $ .com $ datastax $ spark $ connector $ cql $ CassandraConnector $$ createSession(CassandraConnector.scala: 155)在com.datastax.spark.connector.cql.CassandraConnector $$ anonfun $ 2.apply(CassandraConnector.scala:150)在com.datastax.spark.connector.cql.CassandraConnector $$ anonfun $ 2.apply(CassandraConnector.scala: 150)位于com.datastax.spark.com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)的com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31) .connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)位于com.datastax.spark.conne ctor.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109) at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120) at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241) at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(CassandraTableRowReaderProvider.scala:51) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:150) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59) at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239) at org.apache.spark.rd ctor.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)位于com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)位于com.datastax.spark.connector.cql.Schema $ .fromCassandra (Schema.scala:241)在com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider $ class.tableDef(CassandraTableRowReaderProvider.scala:51)在com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef $ lzycompute(CassandraTableScanRDD :59),位于com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider $ 
class.verify(CassandraTableRowReaderProvider.scala:150)的com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:59)处。 org.apache.spark.rdd.RDD上的datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:59)位于com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:143)在org.apache.spark.rdd.RDD $ $ anonfun $ partitions $ 2.apply(RDD.scala:239)在org.apache.spark.rd d.RDD$$anonfun$partitions$2.apply(RDD.scala:237) at scala.Option.getOrElse(Option.scala:120) at org.apache.spark.rdd.RDD.partitions(RDD.scala:237) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919) at org.apache.spark.rdd.RDD.count(RDD.scala:1121) at main.scala.TestSpark$.main(TestSpark.scala:20) at main.scala.TestSpark.main(TestSpark.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) at org.apache.spark.deploy.SparkSubm d.RDD $$ anonfun $ partitions $ 2.apply(RDD.scala:237)在scala.Option.getOrElse(Option.scala:120)在org.apache.spark.rdd.RDD.partitions(RDD.scala:237)在org.apache.spark.SparkContext.runJob(SparkContext.scala:1919)在org.apache.spark.rdd.RDD.count(RDD.scala:1121)在main.scala.TestSpark $ .main(TestSpark.scala: 20)位于sun.reflect.NativeMethodAccessorImpl.invoke的main.scala.TestSpark.main(TestSpark.scala)处(sun.reflect.NativeMethodAccessorImpl.invoke的(Native Method)处的(NativeMethod)在sun.reflect.DelegatingMethodAccessorImpl.invoke的(NativeMethodAccessorImpl.java:57) (DelegatingMethodAccessorImpl.java:43),位于org.apache.spark.deploy.SparkSubmit $ .org $ apache $ spark $ deploy $ SparkSubmit $$ runMain(SparkSubmit)的java.lang.reflect.Method.invoke(Method.java:606)。 .scala:672)在org.apache.spark.deploy.SparkSubmit $ .doRunMain $ 1(SparkSubmit.scala:180)在org.apache.spark.deploy.SparkSubmit $ .submit(SparkSubmit.scala:205)在org.apache .spark.deploy.SparkSubmit $ .main(SparkSubmit.scala:120)在org.apache.spark.deploy.SparkSubm it.main(SparkSubmit.scala) Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.AsyncFunction at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:425) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ... 34 more it.main(SparkSubmit.scala)造成原因:java.lang.ClassNotFoundException:java.net.URLClassLoader $ 1.run(java.net.URLClassLoader $ 1.run(URLClassLoader.java:366)上的com.google.common.util.concurrent.AsyncFunction $ 1.run(URLClassLoader.java:355)在java.security.AccessController.doPrivileged(本机方法)在java.net.URLClassLoader.findClass(URLClassLoader.java:354)在java.lang.ClassLoader.loadClass(ClassLoader.java: 425)在sun.misc.Launcher $ AppClassLoader.loadClass(Launcher.java:308)在java.lang.ClassLoader.loadClass(ClassLoader.java:358)...另外34个

Finally resolved using sbt-assembly, as suggested by @Odomontois.

This is the final build.sbt:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.1" % "provided"

libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" % "2.1.9"

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"



jarName in assembly := "my-project-assembly.jar"

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)


resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("netty", "handler", xs @ _*)   => MergeStrategy.first
    case PathList("netty", "buffer", xs @ _*)    => MergeStrategy.first
    case PathList("netty", "common", xs @ _*)    => MergeStrategy.first
    case PathList("netty", "transport", xs @ _*) => MergeStrategy.first
    case PathList("netty", "codec", xs @ _*)     => MergeStrategy.first
    case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.first
    case x => old(x)
  }
}
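For completeness: sbt-assembly is an sbt plugin, so it also needs an entry in project/plugins.sbt. A minimal sketch (the plugin version is an assumption; use one compatible with your sbt release):

// project/plugins.sbt (hypothetical version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")

Then build with sbt assembly and submit the resulting fat jar (target/scala-2.10/my-project-assembly.jar) instead of the sbt package output. Because spark-core and spark-sql are marked "provided", they are supplied by the Spark runtime and excluded from the assembly, which keeps the fat jar small and avoids duplicating Spark's own classes.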

You also need to add the DataStax Cassandra driver dependency (at the version matching your spark-cassandra-connector), available from: https://repo1.maven.org/maven2/com/datastax/cassandra/cassandra-driver-core/

