Value lookup is not a member of org.apache.spark.rdd.RDD
This is my whole sample code:
package trouble.something

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object Stack {
  val conf = new SparkConf().setMaster("local[*]").setAppName("app")
  val sc = new SparkContext(conf)

  def ExFunc[Int](looku: RDD[(Int, Long)]) {
    val ke = 3
    looku.lookup(ke);
  }

  def main(args: Array[String]) {
    val pi: RDD[(Int, Long)] = sc.parallelize(Seq((1, 9L), (2, 11L)))
    pi.lookup(3)
    val res = ExFunc[Int](pi)
  }
}
When I execute the following line directly, it runs without any errors and produces output:
pi.lookup(3)
But when I pass pi to a function and call lookup inside it, as below, I get an error:

  val res = ExFunc[Int](pi)
Passing pi to the function below:

  def ExFunc[Int](looku: RDD[(Int, Long)]) {
    val ke = 3
    looku.lookup(ke);
  }
Then I get this error message:
Error:(27, 11) value lookup is not a member of org.apache.spark.rdd.RDD[(Int, Long)]
looku.lookup(ke);
Can anybody help me correct this error?
The function shouldn't be generic here. The type parameter you named `Int` shadows `scala.Int`, so inside `ExFunc` the key type is an arbitrary abstract type with no `ClassTag` in scope, and the implicit conversion to `PairRDDFunctions` (which is what provides `lookup`) cannot be applied. Just remove the type parameter:
  def ExFunc(looku: RDD[(Int, Long)]) {
    val ke = 3
    looku.lookup(ke);
  }
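The shadowing can be seen without Spark at all. In this standalone sketch (the object and method names are hypothetical, chosen just for illustration), a type parameter named `Int` happily accepts a `Seq[String]`, showing that inside such a method `Int` is not `scala.Int`:

```scala
object ShadowDemo {
  // The type parameter named `Int` is a fresh abstract type that shadows
  // scala.Int inside this method, so it accepts any element type.
  def first[Int](xs: Seq[Int]): Int = xs.head

  def main(args: Array[String]): Unit = {
    // Compiles and runs with Strings, proving `Int` was never scala.Int.
    println(first(Seq("a", "b")))
  }
}
```

The same thing happens in `ExFunc[Int]`: the compiler sees an unknown key type, finds no `ClassTag` for it, and the `lookup` extension method never becomes available.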
To create a generic function, provide a ClassTag, for example:
  import scala.reflect.ClassTag

  def ExFunc[T : ClassTag](looku: RDD[(T, Long)], ke: T) {
    looku.lookup(ke);
  }
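The reason the `ClassTag` context bound fixes the generic version is that Spark's implicit conversion to `PairRDDFunctions` demands a `ClassTag` for the key type. A minimal Spark-free sketch of the same requirement (the helper name is hypothetical): building an `Array[T]` needs the runtime class of `T`, which is exactly what a `ClassTag` carries:

```scala
import scala.reflect.ClassTag

object ClassTagDemo {
  // Without the ClassTag bound this would not compile, because Seq.toArray
  // needs the runtime class of T to allocate the array.
  def toTypedArray[T : ClassTag](xs: Seq[T]): Array[T] = xs.toArray

  def main(args: Array[String]): Unit = {
    println(toTypedArray(Seq(1, 2, 3)).mkString(","))
  }
}
```

With the `T : ClassTag` bound on `ExFunc`, the caller's concrete key type (here `scala.Int`) supplies the tag, the conversion to `PairRDDFunctions` applies, and `lookup` resolves again.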