
Netty chunked input stream

I have seen lots of questions about chunked streams in Netty, but most of them are about outbound streams, not inbound streams.

I would like to understand how I can get the data from the channel and send it as an InputStream to my business logic without loading all the data into memory first. Here's what I was trying to do:

public class ServerRequestHandler extends MessageToMessageDecoder<HttpObject> {

  private HttpServletRequest request;
  private PipedOutputStream os;
  private PipedInputStream is;

  @Override
  public void handlerAdded(ChannelHandlerContext ctx) throws Exception {
    super.handlerAdded(ctx);
    // Pipe: the decoder writes chunks into os, the business logic reads them from is
    this.os = new PipedOutputStream();
    this.is = new PipedInputStream(os);
  }

  @Override
  public void handlerRemoved(ChannelHandlerContext ctx) throws Exception {
    super.handlerRemoved(ctx);
    this.os.close();
    this.is.close();
  }

  @Override
  protected void decode(ChannelHandlerContext ctx, HttpObject msg, List<Object> out)
      throws Exception {
    if (msg instanceof HttpRequest) {
      // Wrap the request metadata plus the piped InputStream and pass it downstream
      this.request = new CustomHttpRequest((HttpRequest) msg, this.is);
      out.add(this.request);
    }
    if (msg instanceof HttpContent) {
      ByteBuf body = ((HttpContent) msg).content();

      // Copy this chunk into the pipe for the reading side
      if (body.readableBytes() > 0)
        body.readBytes(os, body.readableBytes());

      if (msg instanceof LastHttpContent) {
        os.close();
      }
    }
  }

}
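
(CustomHttpRequest itself is never shown for this attempt; presumably it just pairs the decoded request metadata with the piped InputStream so the business logic can read the body. A minimal sketch of that assumption, leaving out whatever servlet-style interface the real class adapts:)

import java.io.InputStream;
import io.netty.handler.codec.http.HttpRequest;

// Hypothetical shape of CustomHttpRequest for the piped-stream attempt; the
// real class is not shown in the question, so this is illustration only.
public class CustomHttpRequest {

  private final HttpRequest nettyRequest;   // headers, uri, method, etc.
  private final InputStream body;

  public CustomHttpRequest(HttpRequest nettyRequest, InputStream body) {
    this.nettyRequest = nettyRequest;
    this.body = body;
  }

  // The service layer reads the body here; with piped streams this read blocks
  // until the decoder has pushed the matching chunk into the PipedOutputStream.
  public InputStream getInputStream() {
    return body;
  }
}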

And then I have another handler that takes my CustomHttpRequest and passes it to what I call a ServiceHandler, where my business logic will read from the InputStream.

public class ServiceRouterHandler extends SimpleChannelInboundHandler<CustomHttpRequest> {
...
    @Override
    public void channelRead0(ChannelHandlerContext ctx, CustomHttpRequest request) throws IOException {
...
        future = serviceHandler.handle(request, response);
...

This does not work, because when my handler forwards the CustomHttpRequest to the ServiceHandler and the ServiceHandler tries to read from the InputStream, the thread blocks and the HttpContent is never handled by my decoder.

I know I can try to create a separate thread for my business logic, but I have the impression I am overcomplicating things here.
I looked at ByteBufInputStream, but its documentation says:

Please note that it only reads up to the number of readable bytes determined at the moment of construction.

So I don't think it will work for chunked HTTP requests. I also saw ChunkedWriteHandler, which seems fine for outbound chunks, but I couldn't find anything like a ChunkedReadHandler...
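
(To make the limitation concrete: without an aggregator, each HttpContent carries only one chunk, so wrapping it in a ByteBufInputStream only ever exposes that chunk, never the rest of the body. A minimal sketch, assuming Netty 4.x:)

import java.io.InputStream;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.ByteBufInputStream;
import io.netty.handler.codec.http.HttpContent;

class ChunkStreamExample {
  // The returned stream is bounded by content.readableBytes() at construction
  // time, i.e. by this single chunk; later chunks of the same request are
  // never visible through it.
  static InputStream streamOfSingleChunk(HttpContent msg) {
    ByteBuf content = msg.content();
    return new ByteBufInputStream(content);
  }
}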

So my question is: what's the best way to do this? My requirements are:

- Do not keep the data in memory before handing it to the ServiceHandlers;
- The ServiceHandler API should be Netty agnostic (that's why I use my CustomHttpRequest instead of Netty's HttpRequest).

UPDATE: I have got this to work using a more reactive approach on the CustomHttpRequest. Now the request does not give the ServiceHandlers an InputStream to read from (which was blocking); instead, CustomHttpRequest has a readInto(OutputStream) method that returns a Future, and the service handler logic only runs once that OutputStream has been fulfilled. Here is what it looks like:

public class CustomHttpRequest {
  ...constructors and other methods hidden...

  // Completed once the whole body has been written and the stream closed
  private final SettableFuture<Void> writeCompleteFuture = SettableFuture.create();

  // Completed when the service handler supplies its OutputStream via readInto()
  private final SettableFuture<OutputStream> outputStreamFuture = SettableFuture.create();

  // Chain of writes; every chunk is appended to the end of this chain
  private ListenableFuture<Void> lastWriteFuture = Futures.transform(outputStreamFuture, x -> null);

  public ListenableFuture<Void> readInto(OutputStream os) throws IOException {
    outputStreamFuture.set(os);
    return this.writeCompleteFuture;
  }

  ListenableFuture<Void> writeChunk(byte[] buf) {
    // Queue this chunk after all previously queued writes
    this.lastWriteFuture = Futures.transform(lastWriteFuture, (AsyncFunction<Void, Void>) (ignored) -> {
      outputStreamFuture.get().write(buf);
      return Futures.immediateFuture(null);
    });
    return lastWriteFuture;
  }

  void complete() {
    // After the last chunk, close the stream and signal completion
    ListenableFuture<Void> future =
        Futures.transform(lastWriteFuture, (AsyncFunction<Void, Void>) x -> {
          outputStreamFuture.get().close();
          return Futures.immediateFuture(null);
        });
    addFinallyCallback(future, () -> {
      this.writeCompleteFuture.set(null);
    });
  }
}
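
(A rough sketch of how a service handler might consume this API; the handler shape and the file target are assumptions, and the response argument from the original handle(request, response) is omitted for brevity.)

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.MoreExecutors;

// Hypothetical consumer of readInto(...); illustration only.
public class FileDumpServiceHandler {

  public ListenableFuture<Void> handle(CustomHttpRequest request) throws IOException {
    OutputStream os = new FileOutputStream("/tmp/upload.bin");

    // Non-blocking: chunks are copied into os as they arrive on the event
    // loop; the returned future completes after the last chunk is written
    // and the stream is closed by CustomHttpRequest.complete().
    ListenableFuture<Void> done = request.readInto(os);

    Futures.addCallback(done, new FutureCallback<Void>() {
      @Override public void onSuccess(Void ignored) { /* body fully received */ }
      @Override public void onFailure(Throwable t)  { /* clean up the partial file */ }
    }, MoreExecutors.directExecutor());

    return done;
  }
}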

And my updated ServerRequestHandler looks like this:

public class ServerRequestHandler extends MessageToMessageDecoder<HttpObject> {

  private CustomHttpRequest request;

  @Override
  public void handlerAdded(ChannelHandlerContext ctx) throws Exception {
    super.handlerAdded(ctx);
  }

  @Override
  public void handlerRemoved(ChannelHandlerContext ctx) throws Exception {
    super.handlerRemoved(ctx);
  }


  @Override
  protected void decode(ChannelHandlerContext ctx, HttpObject msg, List<Object> out)
      throws Exception {
    if (msg instanceof HttpRequest) {
      HttpRequest request = (HttpRequest) msg;

      this.request = new CustomHttpRequest(request, ctx.channel());

      out.add(this.request);
    }
    if (msg instanceof HttpContent) {
      // Copy the chunk out of the ByteBuf and queue it on the write chain
      ByteBuf buf = ((HttpContent) msg).content();
      byte[] bytes = new byte[buf.readableBytes()];
      buf.readBytes(bytes);

      this.request.writeChunk(bytes);

      if (msg instanceof LastHttpContent) {
        // No more chunks: close the target stream and complete the future
        this.request.complete();
      }
    }
  }
}
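
(For context, here is a rough sketch of how these two handlers could be wired into a server; the port, event loop groups, codec placement, and no-arg constructors are assumptions, not part of the original.)

import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.http.HttpServerCodec;

// Hypothetical bootstrap; everything except ServerRequestHandler and
// ServiceRouterHandler is an assumption for illustration.
public class HttpServer {
  public static void main(String[] args) throws InterruptedException {
    NioEventLoopGroup boss = new NioEventLoopGroup(1);
    NioEventLoopGroup workers = new NioEventLoopGroup();
    try {
      ServerBootstrap b = new ServerBootstrap()
          .group(boss, workers)
          .channel(NioServerSocketChannel.class)
          .childHandler(new ChannelInitializer<SocketChannel>() {
            @Override
            protected void initChannel(SocketChannel ch) {
              // Decodes bytes into HttpRequest / HttpContent chunks
              ch.pipeline().addLast(new HttpServerCodec());
              // Turns those chunks into a CustomHttpRequest
              ch.pipeline().addLast(new ServerRequestHandler());
              // Dispatches the CustomHttpRequest to the business logic
              ch.pipeline().addLast(new ServiceRouterHandler());
            }
          });
      b.bind(8080).sync().channel().closeFuture().sync();
    } finally {
      boss.shutdownGracefully();
      workers.shutdownGracefully();
    }
  }
}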

This works pretty well, but note that everything here still runs on a single thread; for large payloads I might want to hand the work off to another thread so the event loop stays free for other channels.

You're on the right track: if your serviceHandler.handle(request, response) call does a blocking read, you need to run it on a separate thread. Remember, there are supposed to be only a small number of Netty worker threads, so you shouldn't make any blocking calls in worker threads.
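
(One way to do that in Netty without managing threads by hand, assuming the blocking read stays, is to register the blocking handler with its own EventExecutorGroup; compared with the plain wiring above, the only change is the extra group argument on addLast. A sketch:)

import io.netty.channel.ChannelInitializer;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.http.HttpServerCodec;
import io.netty.util.concurrent.DefaultEventExecutorGroup;
import io.netty.util.concurrent.EventExecutorGroup;

public class BlockingAwareInitializer extends ChannelInitializer<SocketChannel> {

  // Threads dedicated to blocking business logic; 16 is an arbitrary example size
  private final EventExecutorGroup serviceGroup = new DefaultEventExecutorGroup(16);

  @Override
  protected void initChannel(SocketChannel ch) {
    ch.pipeline().addLast(new HttpServerCodec());
    ch.pipeline().addLast(new ServerRequestHandler());
    // Handlers registered with an EventExecutorGroup have their callbacks run
    // on that group's threads, so a blocking InputStream read in
    // ServiceRouterHandler no longer stalls the channel's event loop.
    ch.pipeline().addLast(serviceGroup, "serviceRouter", new ServiceRouterHandler());
  }
}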

The other question to ask is: does your service handler need to be blocking? What does it do? If it's shoveling the data over the network anyway, can you incorporate it into the Netty pipeline in a non-blocking way? That way everything is async all the way through, with no blocking calls or extra threads required.
