java.lang.NoClassDefFoundError: org/apache/spark/sql/DataFrame
spark-cassandra (java.lang.NoClassDefFoundError: org/apache/spark/sql/cassandra/package)
I am trying to read a DataFrame from Cassandra 4.0.3 with Spark 3.2.1, using Scala 2.12.15 and sbt 1.6.2, but I am running into a problem.
Here is my sbt file:
name := "StreamHandler"

version := "1.6.2"

scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "3.2.1" % "provided",
  "org.apache.cassandra" % "cassandra-all" % "4.0.3" % "test",
  "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.2.0",
  "com.datastax.cassandra" % "cassandra-driver-core" % "4.0.0"
)

libraryDependencies += "com.datastax.dse" % "dse-java-driver-core" % "2.1.1" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided"
libraryDependencies += "org.apache.commons" % "commons-math3" % "3.6.1" % "provided"
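A `NoClassDefFoundError` for a connector class usually means the class compiled fine but is missing from the driver/executor classpath at runtime, rather than being a compile error. For comparison, a minimal build.sbt covering only what the code below actually uses might look like this (a sketch, not a confirmed fix; it assumes Scala 2.12 / Spark 3.2.1 and drops the `cassandra-all`, `cassandra-driver-core`, and DSE driver entries, since the connector brings in its own Java driver):

```scala
name := "StreamHandler"
version := "1.6.2"
scalaVersion := "2.12.15"

// Spark itself is supplied by the cluster at runtime, so it stays "provided".
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-sql"       % "3.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "3.2.1" % "provided",
  // The connector must NOT be "provided": it has to reach the runtime
  // classpath, e.g. via an assembly (fat) jar or spark-submit --packages.
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.2.0"
)
```

Alternatively, the connector can be supplied at launch with `spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.2.0`, which avoids building a fat jar.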
Here is my Scala file:
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming._
import org.apache.spark.sql.types._
import org.apache.spark.sql.cassandra._
import com.datastax.oss.driver.api.core.uuid.Uuids
import com.datastax.spark.connector._

object StreamHandler {
  def main(args: Array[String]) {
    val spark = SparkSession
      .builder
      .appName("Stream Handler")
      .config("spark.cassandra.connection.host", "localhost")
      .getOrCreate()

    import spark.implicits._

    val Temp_DF = spark
      .read
      .cassandraFormat("train_temperature", "project")
      .load()

    Temp_DF.show(10)
  }
}
Here is the result: