
Providing implicit evidence for context bounds on Object

I'm trying to write some abstractions in some Spark Scala code, but I'm running into issues when using objects. I'm using Spark's Encoder, which converts case classes to database schemas, as the example here, but I think this question applies to any context bound.

Here is a minimal code example of what I'm trying to do:

package com.sample.myexample

import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

abstract class MyTrait[T: TypeTag: Encoder]

object MyObject extends MyTrait[MySparkSchema]

Which fails with the following compilation error:

Unable to find encoder for type com.sample.myexample.MySparkSchema. An implicit Encoder[com.sample.myexample.MySparkSchema] is needed to store com.sample.myexample.MySparkSchema instances in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._  Support for serializing other types will be added in future releases.
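
For reference, a context bound on a class desugars to an implicit constructor parameter, so the declaration of MyTrait above is roughly equivalent to the following sketch (the class and parameter names are illustrative); both implicits must therefore be resolvable at the point where MyObject extends it:

import org.apache.spark.sql.Encoder
import scala.reflect.runtime.universe.TypeTag

// Roughly what `abstract class MyTrait[T: TypeTag: Encoder]` desugars to:
abstract class MyTraitDesugared[T](implicit tt: TypeTag[T], enc: Encoder[T])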

I tried defining the implicit evidence in the object like so (the import statement was suggested by IntelliJ, but it looks a bit weird):

import com.sample.myexample.MyObject.encoder

object MyObject extends MyTrait[MySparkSchema] {
  implicit val encoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}

Which fails with the error message:

MyTrait.scala:13:25: super constructor cannot be passed a self reference unless parameter is declared by-name

One other thing I tried was to convert the object into a class and provide the implicit evidence through the constructor:

class MyObject(implicit evidence: Encoder[MySparkSchema]) extends MyTrait[MySparkSchema]

This compiles and works fine, but at the expense of MyObject now being a class instead.
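
For illustration, here is a minimal sketch of how the class-based version might be used; the Usage object and the value names are hypothetical, and Encoders.product derives the evidence from the case class:

import org.apache.spark.sql.{Encoder, Encoders}

object Usage {
  // Evidence for the constructor parameter of MyObject.
  implicit val myEncoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]

  val myObject = new MyObject // evidence resolved from myEncoder above
}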

Question: Is it possible to provide implicit evidence for the context bounds when extending a trait? Or does the implicit evidence force me to make a constructor and use a class instead?

Your first error almost gives you the solution: you have to import spark.implicits._ for Product types.

You could do this:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._

Full Example

package com.sample.myexample

import org.apache.spark.sql.{Encoder, SparkSession}
import scala.reflect.runtime.universe.TypeTag

case class MySparkSchema(id: String, value: Double)

abstract class MyTrait[T: TypeTag: Encoder]

// A val cannot sit at the top level of a package in Scala 2, so the session
// is wrapped in a helper object (SparkHolder, added here so the example
// compiles) before its implicits are imported.
object SparkHolder {
  val spark: SparkSession = SparkSession.builder().getOrCreate()
}

import SparkHolder.spark.implicits._

object MyObject extends MyTrait[MySparkSchema]
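
As a variant not in the original answer: if you'd rather not create a SparkSession at all, you can derive the Encoder with Encoders.product in a separate object and import it before MyObject is declared. Keeping the implicit outside MyObject also sidesteps the earlier self-reference error, which occurred because resolving an implicit defined inside MyObject for the super constructor would reference the object while it is still being constructed. A sketch, with MyImplicits as a hypothetical helper:

import org.apache.spark.sql.{Encoder, Encoders}

// Hypothetical helper: holds the evidence outside MyObject, so the super
// constructor of MyTrait no longer references the object under construction.
object MyImplicits {
  implicit val mySchemaEncoder: Encoder[MySparkSchema] = Encoders.product[MySparkSchema]
}

import MyImplicits._

object MyObject extends MyTrait[MySparkSchema]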
