No implicits found for parameter evidence
I have a line of code in a Scala app that takes a dataframe with one column and two rows, and assigns the two values to the variables start and end:
val Array(start, end) = datesInt.map(_.getInt(0)).collect()
This code works fine when run in a REPL, but when I try to put the same line in a Scala object in IntelliJ, it inserts a grey (?: Encoder[Int]) before the .collect() call and shows the inline error: No implicits found for parameter evidence$6: Encoder[Int]
I'm pretty new to Scala and I'm not sure how to resolve this.
Spark needs to know how to serialize JVM types in order to send them between the executors and the driver. In some cases the encoders can be generated automatically, and for some types there are explicit implementations written by the Spark developers. In either case they are passed implicitly. If your SparkSession is named spark, then you are missing the following line:
import spark.implicits._
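Put together, a minimal sketch of the fixed object might look like the following. This is only an illustration: spark here is a hypothetical local SparkSession, datesInt is a placeholder DataFrame standing in for yours, and a spark-sql dependency is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object DateRange {
  def main(args: Array[String]): Unit = {
    // Hypothetical local session; in a real app the master and appName will differ.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("DateRange")
      .getOrCreate()

    // Brings the implicit Encoder instances (including Encoder[Int]) into scope.
    import spark.implicits._

    // Placeholder one-column, two-row DataFrame standing in for datesInt.
    val datesInt = Seq(20200101, 20201231).toDF("date")

    // Compiles now: the Encoder[Int] evidence parameter is found implicitly.
    val Array(start, end) = datesInt.map(_.getInt(0)).collect()

    println(s"start=$start end=$end")
    spark.stop()
  }
}
```

Note that the import sits below the val spark declaration: spark.implicits._ is a member of that particular session instance, so the session must exist before it can be imported.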
As you are new to Scala: implicits are parameters that you don't have to pass explicitly. In your example, the map function requires an Encoder[Int]. Adding this import brings one into scope, so it is passed automatically to map.
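To see the mechanism without Spark, here is a self-contained sketch using a hypothetical MyEncoder type class that plays the role of Spark's Encoder: the method declares an implicit evidence parameter, and an import supplies the instance, just as spark.implicits._ does.

```scala
// Hypothetical type class, analogous to Spark's Encoder.
trait MyEncoder[T] {
  def encode(value: T): Array[Byte]
}

object MyImplicits {
  // Analogue of the instances provided by spark.implicits._
  implicit val intEncoder: MyEncoder[Int] = new MyEncoder[Int] {
    def encode(v: Int): Array[Byte] = BigInt(v).toByteArray
  }
}

object ImplicitDemo {
  // Like Dataset.map, this method needs evidence that T can be encoded.
  def encodeAll[T](xs: Seq[T])(implicit enc: MyEncoder[T]): Seq[Array[Byte]] =
    xs.map(enc.encode)

  def main(args: Array[String]): Unit = {
    // Without this import the compiler reports "No implicits found for
    // parameter enc: MyEncoder[Int]" -- the same failure mode as in Spark.
    import MyImplicits._
    val encoded = encodeAll(Seq(1, 2, 3))
    println(encoded.length)
  }
}
```

Removing the import MyImplicits._ line reproduces the compile error, which is exactly what happens in the Spark code when spark.implicits._ is missing.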
Check the Scala documentation on implicit parameters to learn more.