I am trying to read some data using JdbcIO.read in Apache Beam, and it works fine when my code looks like this:
Pipeline p = createPipeline(options);
p.apply(JdbcIO.<TestRow>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(dataSource))
        .withQuery("query ")
        .withCoder(SerializableCoder.of(TestRow.class))
        .withRowMapper(new JdbcIO.RowMapper<TestRow>() {
            @Override
            public TestRow mapRow(ResultSet resultSet) throws Exception {
                TestRow testRow = new TestRow();
                // setters
                return testRow;
            }
        }))
 .apply(MapElements.via(new SimpleFunction<TestRow, String>() {
     @Override
     public String apply(TestRow input) {
         return input.toString();
     }
 }));
But when I refactor this to remove the anonymous functions, moving the call into a separate class that extends DoFn, I don't get any results; the row mapper block is not executed at all.
PCollection<String> t = p
        .apply(Create.<Long>of(1L))
        .apply("Read Data", ParDo.of(readInput));

public abstract class ReadInput<S, T> extends DoFn<Long, TestRow> {
    @DoFn.ProcessElement
    public void processElement(@Element Long seq, final OutputReceiver<TestRow> receiver) {
        getInput(receiver);
    }
}

public class ReadInputOtc extends ReadInput<Long, TestRow> {
    @Override
    protected void getInput(OutputReceiver<TestRow> receiver) {
        JdbcIO.<TestRow>read()
                .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(this.dataSource))
                .withCoder(SerializableCoder.of(TestRow.class))
                .withQuery("query ")
                .withRowMapper(new JdbcIO.RowMapper<TestRow>() {
                    public TestRow mapRow(ResultSet resultSet) throws Exception {
                        TestRow testRow = new TestRow();
                        // setters
                        while (resultSet.next()) {
                            System.out.println(resultSet.getString("id"));
                        }
                        receiver.output(testRow);
                        return testRow;
                    }
                });
    }
}
Thanks for your help.
JdbcIO.<TestRow>read()
just creates a reading PTransform; it does not actually do any reading. To do the read, the transform must be applied to the pipeline object (as in your first example), which produces a PCollection of records. PTransforms are not meant to be used within a DoFn: a DoFn acts on individual elements, not on PCollections of elements.
If you are trying to remove anonymous classes, you could write your code as follows:
[public static] class MyRowMapper implements JdbcIO.RowMapper<TestRow> {
    @Override
    public TestRow mapRow(ResultSet resultSet) throws Exception {
        TestRow testRow = new TestRow();
        ...
        return testRow;
    }
}
[public static] class MyDoFn extends DoFn<TestRow, String> {
    @DoFn.ProcessElement
    public void processElement(@Element TestRow testRow,
                               final OutputReceiver<String> receiver) {
        receiver.output(testRow.toString());
    }
}
Pipeline p = createPipeline(options);
p
.apply(JdbcIO.<TestRow>read()
.withDataSourceConfiguration(
JdbcIO.DataSourceConfiguration.create(dataSource))
.withQuery("query ")
.withCoder(SerializableCoder.of(TestRow.class))
.withRowMapper(new MyRowMapper()))
.apply(ParDo.of(new MyDoFn()));
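Setting Beam aside, the anonymous-to-named refactor itself is just plain Java: a named static class implementing the same interface is interchangeable with the anonymous version wherever that interface is expected. The RefactorSketch and MyMapper names below are illustrative only, using java.util.function.Function as a stand-in for JdbcIO.RowMapper:

```java
import java.util.function.Function;

public class RefactorSketch {
    // Named replacement for an anonymous Function<Integer, String>.
    static class MyMapper implements Function<Integer, String> {
        @Override
        public String apply(Integer input) {
            return "row-" + input;
        }
    }

    public static void main(String[] args) {
        // The anonymous version...
        Function<Integer, String> anon = new Function<Integer, String>() {
            @Override
            public String apply(Integer input) {
                return "row-" + input;
            }
        };
        // ...and the named version behave identically.
        Function<Integer, String> named = new MyMapper();
        System.out.println(anon.apply(1).equals(named.apply(1))); // prints "true"
    }
}
```

Note also that nothing in a Beam pipeline executes until it is actually run, e.g. with p.run().waitUntilFinish().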