This is a simplified version of my code that is failing to compile:
object CompileFail {
  import org.apache.spark.sql.{ Encoder, SparkSession }

  case class Foo(x: Int)

  def run(spark: SparkSession): Int => Int = {
    import spark.implicits._
    add(bar(encode[Foo]))
  }

  def bar(f: => Int): Int = 0
  def encode[A: Encoder]: Int = 0
  def add(a: Int)(b: Int): Int = a + b
}
It fails with the following nonsensical message:
[error] Error while emitting CompileFail.scala
[error] value $u
[error] one error found
[error] Compilation failed
I am on Scala 2.12.15
and Spark 3.1.2
(but it fails on older Spark versions too).
Interesting to note:

- Changing add(a)(b) to add(a, b): it compiles
- Changing bar(f: => Int) to bar(f: Int): it compiles
- Changing add(bar(encode[Foo])) to add(bar(encode[String])): it compiles

What am I doing wrong?
I don't think you are doing anything wrong; it looks like a bug in the Scala compiler. This simple change makes it compile:
object CompileFail {
  import org.apache.spark.sql.{ Encoder, SparkSession }

  case class Foo(x: Int)

  def run(spark: SparkSession): Int => Int = {
    import spark.implicits._
    val i = encode[Foo]
    add(bar(i))
  }

  def bar(f: => Int): Int = 0
  def encode[A: Encoder]: Int = 0
  def add(a: Int)(b: Int): Int = a + b
}
It looks like the implicit resolution (needed for encode[Foo]) conflicts with the eta-expansion of the partially applied add into an Int => Int function value; evaluating the implicit-dependent expression into a val before passing it by-name avoids the crash.
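For context, the return value of run relies on eta-expansion: add(bar(i)) supplies only the first parameter list, and the compiler converts the rest into a function value. A stripped-down, Spark-free sketch of that mechanism (EtaDemo and g are my own names, not from the original code):

```scala
object EtaDemo {
  def bar(f: => Int): Int = 0
  def add(a: Int)(b: Int): Int = a + b

  // Partial application: add(bar(1)) has one parameter list left,
  // so it eta-expands to an Int => Int function value.
  val g: Int => Int = add(bar(1))

  def main(args: Array[String]): Unit =
    println(g(41)) // bar(1) returns 0, so this is add(0)(41) = 41
}
```

Without Spark's implicits in scope this compiles fine, which is consistent with the problem being specific to how the encoder lookup interacts with that conversion.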