No TypeTag available for a case class using Scala 3 with Spark 3
I have this code that runs a Spark job with Scala 3:
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import java.sql.Date

@main def startDatasetJob(): Unit =
  val spark = SparkSession.builder()
    .appName("Datasets")
    .master("local[*]")
    .getOrCreate()

  case class CarRow(
    Name: String,
    Miles_per_Gallon: Double,
    Cylinders: Long,
    Displacement: Double,
    Horsepower: Long,
    Weight_in_lbs: Long,
    Acceleration: Double,
    Year: Date,
    Origin: String
  )

  implicit val carEncoder: Encoder[CarRow] = Encoders.product[CarRow]

  val carsDF = spark.read
    .format("json")
    .option("inferSchema", "true")
    .load("src/main/resources/data/cars.json")

  val carDS = carsDF.as[CarRow]
but I get the error message:
No TypeTag available for CarRow
implicit val carEncoder: Encoder[CarRow] = Encoders.product[CarRow]
I'm a bit confused as to why the compiler seems unable to derive an encoder for the case class. Any help would be appreciated.
Spark's encoder derivation relies on Scala 2.x runtime-reflection mechanisms (in particular TypeTag, which Encoders.product requires implicitly), and some of those no longer exist in Scala 3.
Have you tried spark-scala3? It provides encoder derivation for Scala 3.
libraryDependencies += "io.github.vincenzobaz" %% "spark-scala3" % "0.1.3"
Then
import scala3encoders.given
(see the examples in the repo)
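For completeness, here is a minimal sketch of how the pieces could fit together in an sbt project. The Scala and Spark version numbers are illustrative assumptions, not taken from the question, and the case class is trimmed and moved to the top level (derivation works on ordinary top-level case classes; the explicit Encoders.product line is no longer needed):

```scala
// build.sbt (versions are illustrative assumptions)
// Spark artifacts are published for Scala 2.13, so cross them in explicitly.
scalaVersion := "3.3.1"
libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-sql" % "3.3.0").cross(CrossVersion.for3Use2_13),
  "io.github.vincenzobaz" %% "spark-scala3" % "0.1.3"
)

// src/main/scala/DatasetJob.scala
import org.apache.spark.sql.SparkSession
import scala3encoders.given // derives the Encoder[CarRow] at compile time

// Top-level case class; fields trimmed here for brevity.
case class CarRow(Name: String, Origin: String)

@main def startDatasetJob(): Unit =
  val spark = SparkSession.builder()
    .appName("Datasets")
    .master("local[*]")
    .getOrCreate()

  // No implicit Encoders.product needed: the given import supplies the encoder.
  val carDS = spark.read
    .format("json")
    .load("src/main/resources/data/cars.json")
    .as[CarRow]
```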