
One gRPC server for both unary and bidi streaming with Netty

I have a requirement to implement a gRPC server in Java which is capable of handling both gRPC unary and bidirectional streaming calls. It is possible for the service that uses gRPC bidi streaming to send a large number of messages per second (maybe 2000 messages per second or more). I have two implementations in mind and I'm not sure which one suits my requirement best.

1. Use the same server for both gRPC unary and gRPC bidirectional streaming.

With this approach, since both gRPC unary and bidi streaming use the same port, one boss thread will be allocated for both. So I'm not sure how well it performs when the bidi streams receive a large number of messages per second. (I mean, will the boss thread get busy with the bidi streams and become unavailable for the unary calls?)

    final EventLoopGroup bossGroup = new NioEventLoopGroup(Runtime.getRuntime().availableProcessors());
    final EventLoopGroup workerGroup = new NioEventLoopGroup(Runtime.getRuntime().availableProcessors() * 2);
    int blockingQueueLength = 1000;
    final BlockingQueue<Runnable> blockingQueue = new LinkedBlockingQueue<>(blockingQueueLength);
    final Executor executor = new ThreadPoolExecutor(400, 500, 30, TimeUnit.SECONDS, blockingQueue);

    Server server = NettyServerBuilder.forPort(PORT).maxConcurrentCallsPerConnection(50)
            .keepAliveTime(60, TimeUnit.SECONDS).bossEventLoopGroup(bossGroup)
            .workerEventLoopGroup(workerGroup).addService(new ExtAuthService()).addService(new RateLimitService())
            .channelType(NioServerSocketChannel.class)
            .executor(executor).build();

    try {
        server.start();
        server.awaitTermination();
    } catch (Exception e) {
        Logger.Error("Exception", new Error(e));
    }

2. Use two servers, one for gRPC unary and one for gRPC bidi streaming.

Here the previously mentioned issue goes away, since we allocate two boss threads, one each for gRPC unary and bidi streaming. But for the services I'm using an executor backed by a Java ThreadPoolExecutor, and my question is: should I use two thread pools, one for each of the two services (unary and bidi streaming)?

    final EventLoopGroup bossGroup = new NioEventLoopGroup(Runtime.getRuntime().availableProcessors());
    final EventLoopGroup workerGroup = new NioEventLoopGroup(Runtime.getRuntime().availableProcessors() * 2);
    int blockingQueueLength = 1000;
    final BlockingQueue<Runnable> blockingQueue = new LinkedBlockingQueue<>(blockingQueueLength);
    final Executor executor = new ThreadPoolExecutor(400, 500, 30, TimeUnit.SECONDS, blockingQueue);

    // I have used the same executor for both servers here.
    Server server1 = NettyServerBuilder.forPort(PORT_1).maxConcurrentCallsPerConnection(50)
            .keepAliveTime(60, TimeUnit.SECONDS).bossEventLoopGroup(bossGroup)
            .workerEventLoopGroup(workerGroup).addService(new ExtAuthService())
            .channelType(NioServerSocketChannel.class)
            .executor(executor).build();

    Server server2 = NettyServerBuilder.forPort(PORT_2).maxConcurrentCallsPerConnection(50)
            .keepAliveTime(60, TimeUnit.SECONDS).bossEventLoopGroup(bossGroup)
            .workerEventLoopGroup(workerGroup).addService(new RateLimitService())
            .channelType(NioServerSocketChannel.class)
            .executor(executor).build();

    try {
        server1.start();
        server2.start();
        server1.awaitTermination();
        server2.awaitTermination();
    } catch (Exception e) {
        Logger.Error("Exception", new Error(e));
    }

Use a single server.

The boss thread is only used for accept()ing new connections. It isn't used for the actual processing. That is done by the worker event loops. Each connection is assigned to a single event loop, and a single event loop can service multiple connections.
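For illustration, a minimal sketch of that single-server setup might look like the following. The port, thread counts, and executor sizing are placeholders, and ExtAuthService/RateLimitService are the services from your question; the main point is that one boss thread is plenty, because it only accepts connections.

    import io.grpc.Server;
    import io.grpc.netty.NettyServerBuilder;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.nio.NioServerSocketChannel;

    import java.util.concurrent.Executor;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class SingleServer {
        public static void main(String[] args) throws Exception {
            // One boss thread is enough; it only accept()s new connections.
            final NioEventLoopGroup bossGroup = new NioEventLoopGroup(1);
            // Worker loops do the actual I/O; one per core is a reasonable default.
            final NioEventLoopGroup workerGroup =
                    new NioEventLoopGroup(Runtime.getRuntime().availableProcessors());
            // Application callbacks run on this executor, keeping them off the event loops.
            final Executor appExecutor = Executors.newFixedThreadPool(
                    Runtime.getRuntime().availableProcessors() * 4);

            Server server = NettyServerBuilder.forPort(8080) // placeholder port
                    .channelType(NioServerSocketChannel.class)
                    .bossEventLoopGroup(bossGroup)
                    .workerEventLoopGroup(workerGroup)
                    .keepAliveTime(60, TimeUnit.SECONDS)
                    .executor(appExecutor)
                    .addService(new ExtAuthService())   // unary
                    .addService(new RateLimitService()) // bidi streaming
                    .build()
                    .start();
            server.awaitTermination();
        }
    }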

On a single stream, Netty can process 100k messages per second. But this is actually slow: finding message boundaries is handled by a different thread than the one messages are delivered on, and the communication between those two threads adds latency, which slows things down. With a request(5) trick that avoids that latency, a single Netty stream can process 1250k messages per second. (These performance numbers will vary depending on the machine you run them on, but they are clearly much higher than you need.) See https://github.com/grpc/grpc-java/issues/6696 where the latency issue is discussed.
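Roughly, that trick is manual inbound flow control via ServerCallStreamObserver (available in reasonably recent grpc-java): disable the automatic one-at-a-time requesting and ask for several messages up front. Below is only a sketch; the RateLimitGrpc generated base class, the streamLimits method name, and the RateLimitRequest/RateLimitResponse types are placeholders for whatever your bidi service actually generates.

    import io.grpc.stub.ServerCallStreamObserver;
    import io.grpc.stub.StreamObserver;

    public class RateLimitService extends RateLimitGrpc.RateLimitImplBase {
        @Override
        public StreamObserver<RateLimitRequest> streamLimits(
                final StreamObserver<RateLimitResponse> responseObserver) {
            final ServerCallStreamObserver<RateLimitResponse> serverObserver =
                    (ServerCallStreamObserver<RateLimitResponse>) responseObserver;
            // Take over inbound flow control and request several messages up front,
            // so the transport thread that finds message boundaries stays busy
            // instead of paying the cross-thread latency once per message.
            serverObserver.disableAutoRequest();
            serverObserver.request(5);

            return new StreamObserver<RateLimitRequest>() {
                @Override
                public void onNext(RateLimitRequest request) {
                    // ... handle the message, then keep the pipeline topped up.
                    serverObserver.request(1);
                }

                @Override
                public void onError(Throwable t) {
                    // ... handle the error.
                }

                @Override
                public void onCompleted() {
                    responseObserver.onCompleted();
                }
            };
        }
    }

At 2000 messages per second you almost certainly don't need any of this; it only matters if you ever approach the limits above.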

But let's say for a moment you needed higher performance, or wanted to separate the unary traffic from the streaming traffic. In this case, we'd recommend using two different channels. Each channel would use its own connections and (probably) separate worker event loops.
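As a sketch (not a drop-in recommendation; the host, port, and thread counts are placeholders), two client channels with their own worker event loops might look like this:

    import io.grpc.ManagedChannel;
    import io.grpc.netty.NettyChannelBuilder;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.nio.NioSocketChannel;

    public class TwoChannels {
        public static void main(String[] args) {
            // Each channel gets a dedicated worker group instead of gRPC's shared default.
            final NioEventLoopGroup unaryLoops = new NioEventLoopGroup(2);
            final NioEventLoopGroup streamingLoops = new NioEventLoopGroup(2);

            ManagedChannel unaryChannel = NettyChannelBuilder
                    .forAddress("localhost", 8080)
                    .channelType(NioSocketChannel.class)
                    .eventLoopGroup(unaryLoops)
                    .usePlaintext()
                    .build();

            ManagedChannel streamingChannel = NettyChannelBuilder
                    .forAddress("localhost", 8080)
                    .channelType(NioSocketChannel.class)
                    .eventLoopGroup(streamingLoops)
                    .usePlaintext()
                    .build();

            // Build the unary stubs on unaryChannel and the streaming stubs on
            // streamingChannel; each channel uses its own connection(s) and loops.
        }
    }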

Only if you were very concerned about latency (and had benchmarks showing how much the split helps) should you bother separating the two types of traffic into different servers. And yes, in that case you would use separate Servers and Channels, each with its own workerEventLoopGroup() (on the channel as well; Channels use a shared event loop group by default), with a limited number of threads so each can have its own processor core for processing. But I'd expect that to be a rare situation; at that point you're quickly approaching the point where you want to split the server binary in two, to avoid GC and similar interplay between the services.
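If you did go that far, the server side of the split could be sketched as below (ports and thread counts are again placeholders); the point is that neither server shares its worker loops with the other, and each group is small enough to stay on its own cores.

    import io.grpc.Server;
    import io.grpc.netty.NettyServerBuilder;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.nio.NioServerSocketChannel;

    public class SplitServers {
        public static void main(String[] args) throws Exception {
            // A single boss group can still accept connections for both servers.
            final NioEventLoopGroup bossGroup = new NioEventLoopGroup(1);

            Server unaryServer = NettyServerBuilder.forPort(8080)
                    .channelType(NioServerSocketChannel.class)
                    .bossEventLoopGroup(bossGroup)
                    .workerEventLoopGroup(new NioEventLoopGroup(2)) // dedicated loops
                    .addService(new ExtAuthService())
                    .build()
                    .start();

            Server streamingServer = NettyServerBuilder.forPort(8081)
                    .channelType(NioServerSocketChannel.class)
                    .bossEventLoopGroup(bossGroup)
                    .workerEventLoopGroup(new NioEventLoopGroup(2)) // dedicated loops
                    .addService(new RateLimitService())
                    .build()
                    .start();

            unaryServer.awaitTermination();
            streamingServer.awaitTermination();
        }
    }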
