
Netty TCP Client async messages

I am building a TCP client to receive and send messages. I followed the steps in the Netty user guide and wrote a simple TCP client with a custom handler extending ChannelInboundHandlerAdapter.

In the handler I store the ChannelHandlerContext:

 private ChannelHandlerContext ctx;

 @Override
 public void channelActive(ChannelHandlerContext ctx) throws Exception {
   super.channelActive(ctx);
   // Keep a reference to the context so sendMessage() can write later.
   this.ctx = ctx;
 }

Then I have a send method which uses the ChannelHandlerContext to send messages:

 public void sendMessage(String msg) {
   if (ctx == null) {
     // Not connected yet; channelActive() has not run.
     return;
   }
   // write() only queues the message; flush() actually sends it.
   ChannelFuture cf = ctx.write(Unpooled.copiedBuffer(msg, CharsetUtil.UTF_8));
   ctx.flush();
 }

The other option I have found is to use the Channel object in the client class:

 channel.writeAndFlush(msg);

I need to call the send method from a different thread. What is the best way to do it?

Thanks in advance.

Both ChannelHandlerContext and Channel are thread-safe, so you can write from any thread without worrying.
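
For reference, a minimal sketch of how the pieces can fit together (the host, port, and MyClientHandler are placeholders, not from the original post). The Channel returned by Bootstrap.connect() is handed to a second thread; Netty marshals the actual I/O onto the channel's event loop, which is what makes this safe:

 EventLoopGroup group = new NioEventLoopGroup();
 Bootstrap b = new Bootstrap()
     .group(group)
     .channel(NioSocketChannel.class)
     .handler(new ChannelInitializer<SocketChannel>() {
       @Override
       protected void initChannel(SocketChannel ch) {
         ch.pipeline().addLast(new MyClientHandler()); // hypothetical handler
       }
     });

 // Block until connected, then share the Channel with other threads.
 Channel channel = b.connect("localhost", 8080).sync().channel();

 new Thread(() ->
     channel.writeAndFlush(Unpooled.copiedBuffer("hello", CharsetUtil.UTF_8))
 ).start();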

If you use Channel.write(), the message has to travel through the complete pipeline. But if you use ChannelHandlerContext.write(), it only has to travel through the outbound handlers that sit between that handler and the head of the pipeline. Therefore writing to the ChannelHandlerContext is more efficient.
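
To make that concrete, here is a hedged sketch (the handler names and their ordering are invented for illustration) of how the two write paths traverse a pipeline:

 ChannelPipeline pipeline = channel.pipeline();
 // Layout, head on the left:
 // head -> StringEncoder -> BusinessHandler -> LoggingHandler -> tail
 pipeline.addLast("encoder", new StringEncoder(CharsetUtil.UTF_8));
 pipeline.addLast("business", new BusinessHandler()); // hypothetical inbound handler
 pipeline.addLast("logger", new LoggingHandler());

 // channel.writeAndFlush(...) enters at the tail and passes every
 // outbound handler on its way to the head: logger, then encoder.
 channel.writeAndFlush("via channel");

 // Inside BusinessHandler, ctx.writeAndFlush(...) enters at that
 // handler's own position and moves toward the head: it passes only
 // encoder; logger is skipped.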

Also note that most of the time it is better to use writeAndFlush() instead of write().
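
Since writes issued from another thread complete asynchronously, the ChannelFuture returned by writeAndFlush() is the place to find out whether they succeeded; a small sketch:

 channel.writeAndFlush(Unpooled.copiedBuffer(msg, CharsetUtil.UTF_8))
     .addListener((ChannelFutureListener) f -> {
       if (!f.isSuccess()) {
         // The write failed, e.g. the connection dropped: log or handle it.
         f.cause().printStackTrace();
       }
     });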

Have a look at this presentation for more details.
