java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
spark-cassandra (java.lang.NoClassDefFoundError: org/apache/spark/sql/cassandra/package)
I am trying to read a DataFrame from Cassandra 4.0.3 with Spark 3.2.1, using Scala 2.12.5 and sbt 1.6.2, but I am running into a problem.

Here is my sbt file:
```scala
name := "StreamHandler"
version := "1.6.2"
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided",
  "org.apache.cassandra" % "cassandra-all" % "4.0.3" % "test",
  "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.2.0",
  "com.datastax.cassandra" % "cassandra-driver-core" % "4.0.0"
)

libraryDependencies += "com.datastax.dse" % "dse-java-driver-core" % "2.1.1" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided"
libraryDependencies += "org.apache.commons" % "commons-math3" % "3.6.1" % "provided"
```
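For comparison, here is a minimal sketch of a `build.sbt` that avoids the most common classpath pitfalls with this stack. It assumes the Spark 3.2.1 / connector 3.2.0 pairing above; spark-cassandra-connector 3.x already pulls in a compatible Java driver, so the separate `cassandra-driver-core` and `dse-java-driver-core` entries can usually be dropped (an assumption, not something the question confirms):

```scala
name := "StreamHandler"
version := "0.1.0"
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // Spark itself is supplied by the cluster / spark-submit at runtime
  "org.apache.spark" %% "spark-core"      % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql"       % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided",
  // The connector must NOT be "provided": its classes (including
  // org.apache.spark.sql.cassandra.package) have to reach the runtime
  // classpath, either inside an assembly jar or via --packages
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.2.0"
)
```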
Here is my Scala file:
```scala
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming._
import org.apache.spark.sql.types._
import org.apache.spark.sql.cassandra._
import com.datastax.oss.driver.api.core.uuid.Uuids
import com.datastax.spark.connector._

object StreamHandler {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("Stream Handler")
      .config("spark.cassandra.connection.host", "localhost")
      .getOrCreate()

    import spark.implicits._

    val Temp_DF = spark
      .read
      .cassandraFormat("train_temperature", "project")
      .load()

    Temp_DF.show(10)
  }
}
```
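Since the Spark dependencies are marked `provided`, the connector classes will be missing at runtime unless they are shipped with the job, which is a common cause of this `NoClassDefFoundError`. A sketch of one standard way to run it (assuming spark-submit against Spark 3.2.1 with Scala 2.12; the jar path below is hypothetical):

```shell
spark-submit \
  --class StreamHandler \
  --master local[*] \
  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0 \
  target/scala-2.12/streamhandler_2.12-0.1.0.jar
```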
Here is the result: