
How to chain a Flux to another Flux/Mono and apply separate backpressure?

I have the following reactive code using Flux from Reactor Core:

Flux.create(sink -> ... /* listens to and receives from external source */ , FluxSink.OverflowStrategy.LATEST)
    .flatMap(map -> redisHashReactiveCommands.hmset(key, map))
    //.flatMap(... //want to store same data async into kafka with its own back pressure handling)
    .subscribeOn(Schedulers.parallel())
    .doOnNext(s -> log.debug("Redis consumed. Result -> {}", s))
    .doOnComplete(() -> log.debug("On completed."))
    .doOnError(exception -> log.error("Error occurred while consuming message", exception))
    .subscribe();

As you can see, I have backpressure handling from the external source into my process (FluxSink.OverflowStrategy.LATEST). However, I also want to configure backpressure between my process and Redis (redisHashReactiveCommands.hmset(key, map)), since Redis can be a bigger bottleneck than the external source. I expect I'd need to create another Flux for the Redis part and link it with this one, but how do I achieve that, given that .flatMap works on individual items rather than on a stream of items?
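(As a side note on the Redis bottleneck: flatMap itself takes an optional concurrency argument that acts as backpressure toward slow inner publishers. It subscribes to at most that many inner Monos at a time and only requests more elements from upstream as inner ones complete. A minimal, self-contained sketch below simulates the slow Redis write with a delayed Mono, since redisHashReactiveCommands isn't defined here.)

```java
import java.time.Duration;

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class FlatMapConcurrencyDemo {
    public static void main(String[] args) {
        Flux.range(1, 100)
            // The second argument (8) caps the number of in-flight inner
            // subscriptions; upstream elements are requested only as
            // inner Monos complete, which throttles a slow sink.
            .flatMap(i -> Mono.just("OK-" + i)
                              .delayElement(Duration.ofMillis(50)), 8)
            .doOnNext(s -> System.out.println("Consumed: " + s))
            .blockLast(); // block only for this demo
    }
}
```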

Also, I want to store the same emitted item into Kafka as well, but chaining flatMaps doesn't seem to work. Is there an easy way to link all of these together in one set of functional calls (external source -> my process, my process -> Redis, my process -> Kafka)?

If you're not interested in the result objects in the main sequence, you can combine both saves from within the flatMap. You'd have to move the subscribeOn and the logging inside the flatMap as well, so that they apply to the inner save publishers:

Flux.create(sink -> ... /* listens to and receives from external source */ , FluxSink.OverflowStrategy.LATEST)
    .flatMap(map -> Mono.when(
        redisHashReactiveCommands.hmset(key, map)
            .subscribeOn(Schedulers.parallel())
            .doOnNext(s -> log.debug("Redis consumed. Result -> {}", s)),

        kafkaReactiveCommand.something(map)
            .subscribeOn(Schedulers.parallel())
            .doOnNext(s -> log.debug("Kafka consumed. Result -> {}", s))
    ))
    // Mono.when returns a Mono<Void>, so the sequence is now a Flux<Void>
    .doOnComplete(() -> log.debug("Both redis and kafka completed."))
    .doOnError(exception -> log.error("Error occurred while consuming message", exception))
    .subscribe();

Alternatively, if you're sure both processes emit either a result element or an error, you can combine both results into a Tuple2 by replacing when with zip.
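For example (a self-contained sketch with two stand-in Monos, since the actual Redis and Kafka publishers aren't defined here), zip keeps both values where when discards them:

```java
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;

public class ZipDemo {
    public static void main(String[] args) {
        // Stand-ins for the Redis and Kafka save results.
        Mono<String> redisResult = Mono.just("OK");
        Mono<Long> kafkaResult = Mono.just(42L);

        // zip emits a Tuple2 once both sources have emitted,
        // or propagates an error if either one fails.
        Tuple2<String, Long> both = Mono.zip(redisResult, kafkaResult)
                                        .block(); // block only for this demo
        System.out.println("redis=" + both.getT1() + ", kafka=" + both.getT2());
    }
}
```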
