Scala WindowFunction does not compile
I have been writing a prototype application using Apache Flink. In the process, I chose to use org.apache.flink.streaming.api.functions.windowing.WindowFunction for a particular use case. However, while writing the body of the apply() function, I am facing this error (the code below is not from the application I am writing, since my datatypes are different; it is the sample code from Flink's documentation site):
import scala.collection.Iterable
import scala.collection.Map
import org.apache.flink.streaming.api.functions.windowing.WindowFunction
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector
import scala.collection.JavaConversions._

class MyWindowFunction extends WindowFunction[(String, Long), String, String, TimeWindow] {
  def apply(key: String, window: TimeWindow, input: Iterable[(String, Long)], out: Collector[String]): Unit = {
    var count = 0L
    for (in <- input) {
      count = count + 1
    }
    out.collect(s"Window $window count: $count")
  }
}
The compiler is complaining:
Error:(16, 7) class MyWindowFunction needs to be abstract, since method apply in trait WindowFunction of type
  (x$1: String, x$2: org.apache.flink.streaming.api.windowing.windows.TimeWindow,
   x$3: Iterable[(String, Long)],
   x$4: org.apache.flink.util.Collector[String])Unit is not defined
class MyWindowFunction extends WindowFunction[(String, Long), String, String, TimeWindow] {
I have checked the order of the parameters in apply(); they seem to be correct. For some reason, I am failing to spot the exact source of the error. Could someone please nudge me toward the solution?
I have found the cause of this error. What was not clear to me was that Apache Flink's API expects a java.lang.Iterable, rather than its Scala equivalent:
class MyWindowFunction
    extends WindowFunction[(String, Long), String, String, TimeWindow] {
  override def apply(
      key: String,
      w: TimeWindow,
      iterable: Iterable[(String, Long)], // from java.lang.Iterable
      collector: Collector[String]): Unit = {
    // ....
  }
}
So, I had to import appropriately:
import java.lang.Iterable // from Java
import java.util.Map // from Java
import org.apache.flink.streaming.api.functions.windowing.WindowFunction
import org.apache.flink.streaming.api.windowing.windows.TimeWindow
import org.apache.flink.util.Collector
import scala.collection.JavaConversions._ // implicit conversions

class MyWindowFunction
    extends WindowFunction[(String, Long), String, String, TimeWindow] {
  override def apply(
      key: String,
      w: TimeWindow,
      iterable: Iterable[(String, Long)],
      collector: Collector[String]): Unit = {
    // ....
  }
}
All was well!
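The root cause can be demonstrated without Flink at all: scala.collection.Iterable and java.lang.Iterable are unrelated types, so a method declared with one does not override a method declared with the other. Below is a minimal sketch using only the standard library; JavaStyleWindowFunction and CountingFunction are hypothetical stand-ins for Flink's trait, and it uses JavaConverters' explicit .asScala rather than the implicit JavaConversions shown above:

```scala
import java.lang.Iterable                // shadows scala.collection.Iterable
import scala.collection.JavaConverters._ // provides .asScala on Java collections

// Hypothetical stand-in for the one detail that matters: like Flink's Java API,
// the trait declares its input parameter as java.lang.Iterable.
trait JavaStyleWindowFunction[IN, OUT] {
  def apply(key: String, input: Iterable[IN], out: OUT => Unit): Unit
}

// Declaring `input: scala.collection.Iterable[...]` here would NOT override the
// trait method (the two Iterables are unrelated), and the compiler would again
// insist that the class "needs to be abstract".
class CountingFunction extends JavaStyleWindowFunction[(String, Long), String] {
  override def apply(key: String, input: Iterable[(String, Long)], out: String => Unit): Unit = {
    var count = 0L
    for (_ <- input.asScala) count += 1 // .asScala bridges the Java Iterable
    out(s"Key $key count: $count")
  }
}

// java.util.Arrays.asList yields a java.lang.Iterable, as Flink would.
val data: Iterable[(String, Long)] = java.util.Arrays.asList(("a", 1L), ("b", 2L))
var result = ""
new CountingFunction().apply("a", data, s => result = s)
println(result) // Key a count: 2
```

As an aside, if you use Flink's Scala API there is also a Scala-flavoured variant of the trait, org.apache.flink.streaming.api.scala.function.WindowFunction, whose apply takes a Scala Iterable directly, so no import shadowing or conversion is needed.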