Error calling updateStateByKey in Spark Streaming
I have this generic method in Scala:
def updateStateByKey[S](updateFunc: JFunction2[JList[V], Optional[S],
Optional[S]]) : JavaPairDStream[K, S] = { ... }
When I call it from Java, neither of these compiles:
JavaPairDStream<String, Integer> stateDstream =
pairs.<Integer>updateStateByKey(...);
JavaPairDStream<String, Integer> stateDstream =
pairs.updateStateByKey(...);
How do I invoke the method correctly?
Error message:
The method updateStateByKey(Function2<List<Integer>,Optional<S>,Optional<S>>,
int) in the type JavaPairDStream<String,Integer> is not applicable for
the arguments
(Function2<List<Integer>,Optional<Integer>,Optional<Integer>>,
HashPartitioner, JavaPairRDD<String,Integer>)
Edited: The whole function call (Java 8):
final Function2<List<Integer>, Optional<Integer>, Optional<Integer>> updateFunction =
    (values, state) -> {
        Integer newSum = state.or(0);
        for (Integer value : values) {
            newSum += value;
        }
        return Optional.of(newSum);
    };
JavaPairDStream<String, Integer> stateDstream = pairs.updateStateByKey(
    updateFunction,
    new HashPartitioner(context.defaultParallelism()),
    initialRDD);
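Independently of the Spark API question, the fold semantics of this updateFunction can be checked in plain Java. The sketch below uses the standard library's java.util.function.BiFunction and java.util.Optional in place of Spark's Function2 and Optional (whose or(...) corresponds to orElse(...)), so it runs without any Spark dependency:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.function.BiFunction;

public class UpdateStateDemo {
    // Same fold logic as the Spark updateFunction above, expressed with
    // java.util.Optional (Spark's Optional.or(0) becomes orElse(0) here).
    static final BiFunction<List<Integer>, Optional<Integer>, Optional<Integer>> updateFunction =
            (values, state) -> {
                Integer newSum = state.orElse(0);  // previous state, or 0 on the first batch
                for (Integer value : values) {
                    newSum += value;               // accumulate this batch's values for the key
                }
                return Optional.of(newSum);        // becomes the state for the next batch
            };

    public static void main(String[] args) {
        // Batch 1: the key has no prior state.
        Optional<Integer> s1 = updateFunction.apply(Arrays.asList(1, 2, 3), Optional.empty());
        System.out.println(s1.get());  // prints 6
        // Batch 2: prior state 6 plus new values 4 and 5.
        Optional<Integer> s2 = updateFunction.apply(Arrays.asList(4, 5), s1);
        System.out.println(s2.get());  // prints 15
    }
}
```

This is how updateStateByKey applies the function per key on each batch: the previous state flows in as the Optional, and the returned Optional is stored as the new state.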
Edited: It turned out that generics were not the issue; the parameters simply do not match the method signature.
The problem is that you are passing in an initialRDD, while the method updateStateByKey does not have that as a parameter.
The closest signature is:
updateStateByKey[S](updateFunc: Function2[List[V], Optional[S], Optional[S]],
partitioner: Partitioner): JavaPairDStream[K, S]
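Given that signature, a sketch of the corrected call would simply drop the initialRDD argument so the remaining two arguments match the (updateFunc, partitioner) overload (assuming pairs, updateFunction, and context are as defined in the question above):

```java
// Sketch: match the two-argument overload by dropping initialRDD.
JavaPairDStream<String, Integer> stateDstream = pairs.updateStateByKey(
    updateFunction,
    new HashPartitioner(context.defaultParallelism()));
```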