
spring webflux stream Completion Consumer

I have a Spring WebFlux stream consumer which calls a REST endpoint, consumes the messages received, and saves them to an RDBMS. I am trying to find a way to batch it. I see that subscribe() has an overloaded method which gets called on completion. I am trying to figure out how to get hold of the data when this completion consumer gets called, since I am passing a CompletionConsumer which is of type Runnable, and all I have is the run() method, which does not take any parameters.

**CLIENT**

       WebClient.create("http://localhost:8080")
                .get()
                .uri("/objects")
                .accept(MediaType.TEXT_EVENT_STREAM)
                .exchange()
                .flatMapMany(clientResponse ->clientResponse.bodyToFlux(MyObject.class))
               .subscribe(null,null,completionProcessorSubscriber);


**COMPLETION SUBSCRIBER**


@Service
public class CompletionProcessorSubscriber implements  Runnable{

    @Autowired
    LegacyDAOImpl dao;

    Logger logger = LoggerFactory.getLogger(CompletionProcessorSubscriber.class);


    public void run() {

        logger.info("\ninside RUNNNNNNNNN\n\n");
// here how to get hold of the data stream ?
    }
}

Below is the documentation from the Flux API:

    public final Disposable subscribe(
            @Nullable Consumer<? super T> consumer,
            @Nullable Consumer<? super Throwable> errorConsumer,
            @Nullable Runnable completeConsumer) {
        return subscribe(consumer, errorConsumer, completeConsumer, null);
    }

You should avoid adding too much logic to subscriber methods. Instead, you should utilize the rich set of operators provided by the Flux API.

In this case the operators you need are buffer, to collect batches, and concatMap, to execute the batches sequentially.

In the following example I assume that LegacyDAOImpl is a blocking service whose work should be assigned to an appropriate thread pool.

public static void main(String[] args) throws InterruptedException
{
    webClient.get()
             .uri("/objects")
             .accept(MediaType.TEXT_EVENT_STREAM)
             .exchange()
             .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
             .buffer(100) // batch size
             .concatMap(batchOfMyObjects -> Mono.fromRunnable(() -> legacyDAOImpl.saveAll(batchOfMyObjects))
                                                .subscribeOn(Schedulers.elastic())) // blocking IO goes to elastic thread pool
             .subscribe();
}

private static class LegacyDAOImpl
{
    public void saveAll(List<MyObject> myObjects)
    {
        // save here
    }
}

private static class MyObject
{
}
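
If you really need the entire result set in a single callback rather than in fixed-size batches, one alternative (a sketch only, reusing the webClient and legacyDAOImpl names from the example above) is to collect the stream before subscribing; collectList() emits the complete list only once the upstream completes:

webClient.get()
         .uri("/objects")
         .accept(MediaType.TEXT_EVENT_STREAM)
         .exchange()
         .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
         .collectList()                         // Mono<List<MyObject>> emitted when the stream completes
         .publishOn(Schedulers.elastic())       // keep the blocking DAO call off the event loop
         .subscribe(allObjects -> legacyDAOImpl.saveAll(allObjects));

Keep in mind that this buffers everything in memory, so the buffer/concatMap approach above is preferable for large streams. Also note that newer Reactor versions (3.4+) deprecate Schedulers.elastic() in favour of Schedulers.boundedElastic().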
