spark - scala: not a member of org.apache.spark.sql.Row

How to construct a function that can be used for mapping a JavaRDD[org.apache.spark.sql.Row] in spark/scala?
```scala
val drdd = Seq(("a", 1), ("b", 2), ("a", 3)).toDF("name", "value").toJavaRDD
drdd.map{ (row: Row) => row.get(0) }
```
It looks like the anonymous function I am passing has type Row => Any, whereas something of type org.apache.spark.api.java.function.Function[org.apache.spark.sql.Row, ?] is expected:
```
<console>:35: error: type mismatch;
 found   : org.apache.spark.sql.Row => Any
 required: org.apache.spark.api.java.function.Function[org.apache.spark.sql.Row,?]
       drdd.map{ (row: Row) => row.get(0) }
                            ^
```
What is the difference between these function types, and how should I construct one? Thanks!
The Java RDD API does not take plain Scala functions; it takes instances of the org.apache.spark.api.java.function.Function interface, whose single abstract method is call. You can implement it explicitly. Example:
```scala
drdd.map(new org.apache.spark.api.java.function.Function[Row, String]() {
  override def call(row: Row): String = row.getString(0)
})
```
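Alternatively, since it is the call to `toJavaRDD` that introduces the Java-style `Function` requirement in the first place, you can stay in the Scala API, where `RDD.map` accepts an ordinary Scala function. A minimal sketch under the assumption that a `SparkSession` bound to the name `spark` is in scope (as it is by default in `spark-shell`):

```scala
import org.apache.spark.sql.Row

// Assumption: `spark` is an active SparkSession, e.g. in spark-shell.
import spark.implicits._

val df = Seq(("a", 1), ("b", 2), ("a", 3)).toDF("name", "value")

// Option 1: never convert to a JavaRDD; df.rdd is a Scala RDD[Row],
// whose map takes a plain Scala function Row => String.
val names1 = df.rdd.map((row: Row) => row.getString(0))

// Option 2: if you already have a JavaRDD, convert it back to a
// Scala RDD with its .rdd method before mapping.
val javaRdd = df.toJavaRDD
val names2 = javaRdd.rdd.map((row: Row) => row.getString(0))
```

On Scala 2.12 and later, SAM conversion may also allow a Scala lambda to be passed directly where the Java `Function` interface is expected, but converting back to the Scala RDD keeps the types unambiguous.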