
No TypeTag available for a case class using Scala 3 with Spark 3

I have the following code, which runs a Spark job with Scala 3:

import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import java.sql.Date

@main def startDatasetJob(): Unit =
  val spark = SparkSession.builder()
    .appName("Datasets")
    .master("local[*]")
    .getOrCreate()

  case class CarRow(Name: String,
                    Miles_per_Gallon: Double,
                    Cylinders: Long,
                    Displacement: Double,
                    Horsepower: Long,
                    Weight_in_lbs: Long,
                    Acceleration: Double,
                    Year: Date,
                    Origin: String
                   )

  implicit val carEncoder: Encoder[CarRow] = Encoders.product[CarRow]
  val carsDF = spark.read
    .format("json")
    .option("inferSchema", "true")
    .load("src/main/resources/data/cars.json")

  val carDS = carsDF.as[CarRow]

but I get the error message:

No TypeTag available for CarRow
  implicit val carEncoder: Encoder[CarRow] = Encoders.product[CarRow]

I'm a bit confused as to why the compiler seems unable to handle the case class. Can anyone help?

Spark's encoder derivation relies on Scala 2.x reflection mechanisms (such as TypeTag), some of which no longer exist in Scala 3.
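
For context, Encoders.product in Spark 3.x requires an implicit TypeTag, a Scala 2 reflection construct that the Scala 3 compiler cannot materialize; roughly:

// Signature in org.apache.spark.sql.Encoders (Spark 3.x)
def product[T <: Product : TypeTag]: Encoder[T]

// Scala 3 cannot synthesize scala.reflect.runtime.universe.TypeTag[CarRow],
// so the implicit search fails with "No TypeTag available for CarRow".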

Have you tried spark-scala3? It provides encoder derivation for Scala 3.

libraryDependencies += "io.github.vincenzobaz" %% "spark-scala3" % "0.1.3"

Then

import scala3encoders.given

(see the examples in the repo)
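
Applied to the code in the question, a minimal sketch might look like the following (assuming the case class is moved to the top level and that Year is a java.sql.Date supported by the library's derivation):

import org.apache.spark.sql.SparkSession
import java.sql.Date
import scala3encoders.given // derives Encoder[CarRow] at compile time

// Defined at the top level rather than inside the method body
case class CarRow(
    Name: String,
    Miles_per_Gallon: Double,
    Cylinders: Long,
    Displacement: Double,
    Horsepower: Long,
    Weight_in_lbs: Long,
    Acceleration: Double,
    Year: Date,
    Origin: String
)

@main def startDatasetJob(): Unit =
  val spark = SparkSession.builder()
    .appName("Datasets")
    .master("local[*]")
    .getOrCreate()

  val carDS = spark.read
    .format("json")
    .option("inferSchema", "true")
    .load("src/main/resources/data/cars.json")
    .as[CarRow] // uses the derived Encoder[CarRow], no TypeTag needed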
