Is it safe for a Flink application to have multiple data/key streams in a job, all sharing the same Kafka source and sink?
A DAG-like fan-out -> fan-in topology:
var fanoutStreamOne = new StreamComponents(/*filter, flatmap, etc*/);
var fanoutStreamTwo = new StreamComponents(/*filter, flatmap, etc*/);
var fanoutStreamThree = new StreamComponents(/*filter, flatmap, etc*/);
var fanoutStreams = Set.of(fanoutStreamOne, fanoutStreamTwo, fanoutStreamThree);
var source = new FlinkKafkaConsumer<>(...);
var sink = new FlinkKafkaProducer<>(...);
// creates streams from the same source to the same sink (fanned back in with union())
new StreamingJob(source, sink, fanoutStreams).execute();
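As a plain-Java analogy (no Flink dependency; the class and method names here are made up for illustration, not Flink API), the fan-out/fan-in shape above amounts to filtering one source three ways and concatenating the results. One semantic worth noting: a union is a plain merge, so a record that passes several branch filters reaches the sink once per branch.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FanOutFanIn {

    // Fan out: three "branches", each with its own filter, all reading the same source.
    // Fan in: concatenate the branch outputs into a single sink, like union().
    public static List<Integer> fanOutFanIn(List<Integer> source) {
        List<Integer> branchOne = source.stream()
                .filter(x -> x % 2 == 0)       // branch 1: evens
                .collect(Collectors.toList());
        List<Integer> branchTwo = source.stream()
                .filter(x -> x % 3 == 0)       // branch 2: multiples of 3
                .collect(Collectors.toList());
        List<Integer> branchThree = source.stream()
                .filter(x -> x > 4)            // branch 3: large values
                .collect(Collectors.toList());

        // "union": merge all branches; duplicates across branches are preserved
        return Stream.of(branchOne, branchTwo, branchThree)
                .flatMap(List::stream)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // 6 matches all three filters, so it appears three times in the merged output
        System.out.println(fanOutFanIn(List.of(1, 2, 3, 4, 5, 6)));
        // → [2, 4, 6, 3, 6, 5, 6]
    }
}
```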
I'm just curious whether this affects the Flink application's recovery/checkpointing or its performance.
Has anyone had success with this kind of implementation?
Should I assign the watermark strategy up front, before the filtering?
Thanks in advance!