Create DataSet From JDBC Source in Flink using Scala
I am trying to create a dataset from a JDBC source using Scala in Flink; all the docs / other SO questions seem to use Java. I'm having some issues with generic types.
So far I have:
val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
.setDrivername(driver)
.setDBUrl(url)
.setUsername(username)
.setPassword(password)
.setQuery("select col_a,col_b from my_table")
.finish()
env.createInput(inputFormat)
This gives an error:
error: could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[?0]
env.createInput(inputFormat)
I also tried:
var tuple = ("",0)
inputFormat.nextRecord(tuple)
Which gave the error:
error: type mismatch;
found : (String, Int)
required: ?0
And finally I tried:
inputFormat.nextRecord(_)
Which resulted in:
found : x$1.type (with underlying type ?0)
required: ?0
So the question is: how do I set up a JDBC connection in Flink using Scala, and where am I going wrong?
For fixing the first issue:
error: could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[?0]
env.createInput(inputFormat)
you need to add the following import statement:
import org.apache.flink.api.scala._
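Putting it together, a minimal sketch of the program with that import in place might look like the following. The connection settings (driver class, URL, credentials) are hypothetical placeholders, and this assumes the legacy DataSet API where `env.createInput` takes an implicit `TypeInformation` evidence parameter, which the `org.apache.flink.api.scala._` package object supplies:

```scala
// The wildcard import brings the implicit TypeInformation
// instances into scope, resolving the "could not find implicit
// value for evidence parameter" error.
import org.apache.flink.api.scala._
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat

object JdbcExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Hypothetical connection settings -- substitute your own.
    val driver   = "org.postgresql.Driver"
    val url      = "jdbc:postgresql://localhost:5432/mydb"
    val username = "user"
    val password = "secret"

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername(driver)
      .setDBUrl(url)
      .setUsername(username)
      .setPassword(password)
      .setQuery("select col_a, col_b from my_table")
      .finish()

    // With the scala package object imported, the compiler can now
    // find the TypeInformation evidence that createInput requires.
    val data = env.createInput(inputFormat)
    data.print()
  }
}
```

Note that the `nextRecord` attempts in the question are not needed: `createInput` drives the input format itself, so you never call `nextRecord` by hand.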