
Providing implicit evidence for context bounds on Object

I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. As an example I'm using Spark's Encoder, which is used to convert case classes to database schemas, but I think this question applies to any context bound.

Here is a minimal code example of what I'm trying to do:

package com.sample.myexample

import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

abstract class MyTrait[T: TypeTag: Encoder]

object MyObject extends MyTrait[MySparkSchema]

Which fails with the following compilation error:

Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
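The context bounds on MyTrait desugar to implicit constructor parameters, which is why the extends clause needs an Encoder[MySparkSchema] in scope at the point where MyObject is defined. Roughly the equivalent desugared form (the parameter names here are illustrative):

// What the two context bounds expand to, approximately:
abstract class MyTrait[T](implicit ttag: TypeTag[T], enc: Encoder[T])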

I tried defining the implicit evidence in the object like so (the import statement was suggested by IntelliJ, but it looks a bit weird):

import com.sample.myexample.MyObject.encoder
import org.apache.spark.sql.Encoders

object MyObject extends MyTrait[MySparkSchema] {
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}

Which fails with the error message

MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name

That fails because the encoder lives on MyObject itself, so resolving it for the super constructor call would reference MyObject before it is constructed.

One other thing I tried is to convert the object to a class and provide the implicit evidence via the constructor:

class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]

This compiles and works fine, but at the expense of MyObject now being a class instead of an object.
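For illustration, a minimal sketch of what instantiating the class variant then looks like (using Encoders.product to supply the evidence explicitly; the value name is arbitrary):

import org.apache.spark.sql.Encoders

// Callers must now have an Encoder[MySparkSchema] in implicit scope,
// or pass one explicitly to the implicit parameter list:
val myObject = new MyObject()(Encoders.product[MySparkSchema])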

Question: Is it possible to provide the implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to make it a class with a constructor instead?

Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.

You could do this:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._

Full Example

package com.sample.myexample

import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

abstract class MyTrait[T: TypeTag: Encoder]

// A val and the import of its members can't sit at the top level of a
// Scala 2 source file, so they live in an enclosing object (the name
// MyApp is arbitrary):
object MyApp {
  val spark: SparkSession = SparkSession.builder().getOrCreate()
  import spark.implicits._

  object MyObject extends MyTrait[MySparkSchema]
}
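Alternatively, a sketch of another option (not required by the answer above): if you'd rather not depend on a SparkSession at definition time, you can place an encoder built with Encoders.product in the companion object of MySparkSchema, where implicit search finds it without any import:

import org.apache.spark.sql.{Encoder, Encoders}

case class MySparkSchema(id: String, value: Double)

object MySparkSchema {
  // Lives in the implicit scope of MySparkSchema, so MyObject's
  // extends clause resolves it automatically.
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}

object MyObject extends MyTrait[MySparkSchema]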
