JdbcIO.read is not returning results in Apache Beam
I am trying to read some data using JdbcIO.read in Apache Beam, and it works fine if I have code as follows:
Pipeline p = createPipeline(options);
p.apply(JdbcIO.<TestRow>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(dataSource))
        .withQuery("query ")
        .withCoder(SerializableCoder.of(TestRow.class))
        .withRowMapper(new JdbcIO.RowMapper<TestRow>() {
            @Override
            public TestRow mapRow(ResultSet resultSet) throws Exception {
                TestRow testRow = new TestRow();
                //setters
                return testRow;
            }
        }))
    .apply(MapElements.via(new SimpleFunction<TestRow, String>() {
        @Override
        public String apply(TestRow input) {
            return input.toString();
        }
    }));
But when I refactor this to remove the anonymous functions, moving the call into a separate class that extends DoFn, I do not get any results: the row-mapper block is never executed at all.
PCollection<TestRow> t = p
        .apply(Create.of(1L))
        .apply("Read Data", ParDo.of(readInput));

public abstract class ReadInput<S, T> extends DoFn<Long, TestRow> {
    @DoFn.ProcessElement
    public void processElement(@Element Long seq, final OutputReceiver<TestRow> receiver) {
        getInput(receiver);
    }

    protected abstract void getInput(OutputReceiver<TestRow> receiver);
}

public class ReadInputOtc extends ReadInput<Long, TestRow> {
    @Override
    protected void getInput(OutputReceiver<TestRow> receiver) {
        JdbcIO.<TestRow>read()
                .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(this.dataSource))
                .withCoder(SerializableCoder.of(TestRow.class))
                .withQuery("query ")
                .withRowMapper(new JdbcIO.RowMapper<TestRow>() {
                    public TestRow mapRow(ResultSet resultSet) throws Exception {
                        TestRow testRow = new TestRow();
                        //setters
                        while (resultSet.next()) {
                            System.out.println(resultSet.getString("id"));
                        }
                        receiver.output(testRow);
                        return testRow;
                    }
                });
    }
}
Thanks for your help.
JdbcIO.<TestRow>read() just creates a reading PTransform; it does not actually do any reading. To perform the read, the transform must be applied to the pipeline object (as in your first example), which produces a PCollection of records. PTransforms are not meant to be used within a DoFn: DoFns act on individual elements, not on PCollections of elements.
If you are trying to remove the anonymous classes, you could write your code as follows:
public static class MyRowMapper implements JdbcIO.RowMapper<TestRow> {
    @Override
    public TestRow mapRow(ResultSet resultSet) throws Exception {
        TestRow testRow = new TestRow();
        ...
        return testRow;
    }
}
public static class MyDoFn extends DoFn<TestRow, String> {
    @DoFn.ProcessElement
    public void processElement(@Element TestRow testRow,
                               final OutputReceiver<String> receiver) {
        receiver.output(testRow.toString());
    }
}
Pipeline p = createPipeline(options);
p
.apply(JdbcIO.<TestRow>read()
.withDataSourceConfiguration(
JdbcIO.DataSourceConfiguration.create(dataSource))
.withQuery("query ")
.withCoder(SerializableCoder.of(TestRow.class))
.withRowMapper(new MyRowMapper()))
.apply(ParDo.of(new MyDoFn()));
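If the intent of the refactor was to issue a JDBC query per input element (which is what the second snippet attempts inside the DoFn), Beam supports that pattern directly with JdbcIO.readAll(). A minimal sketch, reusing TestRow, dataSource, and the row mapper from above; the table, column, and parameterized query here are illustrative placeholders, not from the question:

```java
// Sketch only: assumes p, dataSource, TestRow, and MyRowMapper as defined
// earlier. readAll() runs the query once per element of the input PCollection.
PCollection<Long> ids = p.apply(Create.of(1L, 2L, 3L));

PCollection<TestRow> rows = ids.apply(
    JdbcIO.<Long, TestRow>readAll()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(dataSource))
        .withQuery("SELECT * FROM test_table WHERE id = ?")  // hypothetical query
        .withParameterSetter((element, statement) -> statement.setLong(1, element))
        .withCoder(SerializableCoder.of(TestRow.class))
        .withRowMapper(new MyRowMapper()));
```

This keeps the read inside Beam's IO machinery instead of opening result sets inside a DoFn by hand, so the runner can manage connections and parallelism for you.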