I am trying to create a dataset from a JDBC source using Scala in Flink; all the docs and other SO questions seem to use Java. I'm having some issues with generic types.
So far I have:
val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
.setDrivername(driver)
.setDBUrl(url)
.setUsername(username)
.setPassword(password)
.setQuery("select col_a,col_b from my_table")
.finish()
env.createInput(inputFormat)
This gives an error:
error: could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[?0]
env.createInput(inputFormat)
I also tried
var tuple = ("",0)
inputFormat.nextRecord(tuple)
Which gave the error:
error: type mismatch;
found : (String, Int)
required: ?0
And finally I tried:
inputFormat.nextRecord(_)
Which resulted in:
found : x$1.type (with underlying type ?0)
required: ?0
So the question is: how do I set up a JDBC connection in Flink using Scala, and where am I going wrong?
To fix the first error:
error: could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[?0]
env.createInput(inputFormat)
you need to add the following import statement, which brings Flink's implicit TypeInformation instances for Scala types into scope:
import org.apache.flink.api.scala._
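Putting it together, here is a sketch of the full setup, assuming a Flink 1.x flink-jdbc dependency where JDBCInputFormat produces Row values and the builder exposes setRowTypeInfo so Flink knows the column types (the connection values and column types below are placeholders for this example):

```scala
import org.apache.flink.api.scala._
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat
import org.apache.flink.api.java.typeutils.RowTypeInfo
import org.apache.flink.api.common.typeinfo.BasicTypeInfo
import org.apache.flink.types.Row

object JdbcExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Placeholder connection settings -- substitute your own.
    val driver   = "org.postgresql.Driver"
    val url      = "jdbc:postgresql://localhost:5432/mydb"
    val username = "user"
    val password = "secret"

    // Describe the row layout: here col_a is assumed to be a string
    // column and col_b an integer column.
    val rowTypeInfo = new RowTypeInfo(
      BasicTypeInfo.STRING_TYPE_INFO,
      BasicTypeInfo.INT_TYPE_INFO)

    val inputFormat = JDBCInputFormat.buildJDBCInputFormat()
      .setDrivername(driver)
      .setDBUrl(url)
      .setUsername(username)
      .setPassword(password)
      .setQuery("select col_a, col_b from my_table")
      .setRowTypeInfo(rowTypeInfo)
      .finish()

    // With the scala._ import in scope, createInput can resolve
    // the implicit TypeInformation[Row] evidence parameter.
    val dataset: DataSet[Row] = env.createInput(inputFormat)
    dataset.print()
  }
}
```

Setting the RowTypeInfo on the builder also avoids the later `?0` type-mismatch errors from nextRecord, since the input format's element type is no longer an unresolved generic.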