I've created a pipeline to save Google Cloud Pub/Sub messages into text files using Apache Beam and Java. When I run the pipeline on Google Dataflow with --runner=DataflowRunner, the messages are saved correctly. However, when I run the same pipeline locally with --runner=DirectRunner, no files are written. I can watch the events coming through the pipeline, but nothing is saved. The pipeline code is below:
public static void main(String[] args) {
    ExerciseOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(ExerciseOptions.class);
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        .apply("Read Messages from Pubsub",
            PubsubIO.readStrings().fromTopic(options.getTopicName()))
        .apply("Set event timestamp", ParDo.of(new DoFn<String, String>() {
            @ProcessElement
            public void processElement(ProcessContext context) {
                context.outputWithTimestamp(context.element(), Instant.now());
            }
        }))
        .apply("Windowing", Window.into(FixedWindows.of(Duration.standardMinutes(5))))
        .apply("Write to File",
            TextIO.write()
                .withWindowedWrites()
                .withNumShards(1)
                .to(options.getOutputPrefix()));

    pipeline.run();
}
What am I doing wrong? Is it possible to run this pipeline locally?
I was facing the same problem as yours while testing a pipeline: PubsubIO combined with windowed TextIO writes does not behave correctly on the DirectRunner — the panes never fire, so no files are produced. I found a workaround for this issue using explicit triggering:
.apply("2 minutes window",
    Window
        .configure()
        .triggering(
            Repeatedly.forever(
                AfterFirst.of(
                    AfterPane.elementCountAtLeast(10),
                    AfterProcessingTime
                        .pastFirstElementInPane()
                        .plusDelayOf(Duration.standardMinutes(2)))))
        // Beam requires allowed lateness and an accumulation mode
        // whenever a non-default trigger is set.
        .withAllowedLateness(Duration.ZERO)
        .discardingFiredPanes()
        .into(FixedWindows.of(Duration.standardMinutes(2))))
This way the files are written as they should be. Hope this helps someone.
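Putting the two pieces together, the original pipeline with the triggering workaround spliced into the windowing step might look like the sketch below. It assumes the same ExerciseOptions interface from the question (with getTopicName() and getOutputPrefix()); the class name PubsubToText is made up for illustration.

public class PubsubToText {
    public static void main(String[] args) {
        ExerciseOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(ExerciseOptions.class);
        Pipeline pipeline = Pipeline.create(options);

        pipeline
            .apply("Read Messages from Pubsub",
                PubsubIO.readStrings().fromTopic(options.getTopicName()))
            .apply("Set event timestamp", ParDo.of(new DoFn<String, String>() {
                @ProcessElement
                public void processElement(ProcessContext context) {
                    context.outputWithTimestamp(context.element(), Instant.now());
                }
            }))
            // Explicit trigger so panes fire on the DirectRunner even when the
            // watermark does not advance past the window boundary: fire after
            // 10 elements or after 2 minutes of processing time, whichever
            // comes first.
            .apply("2 minutes window",
                Window.<String>into(FixedWindows.of(Duration.standardMinutes(2)))
                    .triggering(Repeatedly.forever(AfterFirst.of(
                        AfterPane.elementCountAtLeast(10),
                        AfterProcessingTime.pastFirstElementInPane()
                            .plusDelayOf(Duration.standardMinutes(2)))))
                    .withAllowedLateness(Duration.ZERO)
                    .discardingFiredPanes())
            .apply("Write to File",
                TextIO.write()
                    .withWindowedWrites()
                    .withNumShards(1)
                    .to(options.getOutputPrefix()));

        pipeline.run().waitUntilFinish();
    }
}

Note that discardingFiredPanes() avoids re-writing elements that already appeared in an earlier pane of the same window; with withNumShards(1) each firing produces a single output file per window and pane.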