
ServerSocket.accept() on host does not accept connection originating in guest VM

In my Java app, I have the following code:

_serverSocket = new ServerSocket();
_serverSocket.setReuseAddress(true);
_serverSocket.bind(new InetSocketAddress(port));
// _serverSocket = ServerSocket[addr=0.0.0.0/0.0.0.0,localport=33202]
_socket = _serverSocket.accept();
...
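For context, a self-contained version of the setup above can be run locally; this sketch connects a client from the same process so accept() returns (in the original problem the client runs in the guest VM, and the port here is ephemeral rather than the fixed 33202):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class AcceptDemo {
    public static void main(String[] args) throws IOException {
        // Bind to the wildcard address (0.0.0.0) on an ephemeral port,
        // mirroring the original bind(new InetSocketAddress(port)) setup.
        ServerSocket serverSocket = new ServerSocket();
        serverSocket.setReuseAddress(true);
        serverSocket.bind(new InetSocketAddress(0));
        System.out.println("Listening on " + serverSocket.getLocalSocketAddress());

        // Connect from the same JVM so accept() is guaranteed to return;
        // in the question, this role is played by the client in the VM.
        try (Socket client = new Socket("127.0.0.1", serverSocket.getLocalPort());
             Socket accepted = serverSocket.accept()) {
            System.out.println("Accepted connection from " + accepted.getRemoteSocketAddress());
        }
        serverSocket.close();
    }
}
```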

In a VM, I launch a (black-box) client application that tries to connect to the above server. For some unknown reason, no connection is established and accept() never returns.

I know the client app is trying to connect, as I see the corresponding packets in Wireshark: (Wireshark screenshot)

I can't launch the same client application on my host machine, but I can successfully establish a TCP connection from the host with nc, so I know the server is actually listening. The VM and host are in the 10.11.1.0 subnet. The host has IP 10.11.1.1, the VM 10.11.1.68. They can ping each other.
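The reachability check done with nc can also be scripted in Java, which is handy when nc is not available on a machine (a minimal sketch; the host IP and port below are taken from the question):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    // Attempt a TCP connect with a timeout; true means the handshake
    // completed, i.e. a listener is reachable from this machine.
    static boolean isReachable(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // 10.11.1.1:33202 are the host address and server port from the question.
        System.out.println(isReachable("10.11.1.1", 33202, 2000) ? "open" : "unreachable");
    }
}
```

Running this probe from inside the VM, rather than on the host, tests exactly the path that is failing.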

But still, for some reason, the TCP packets do not seem to reach my Java code. Any ideas why, or how I could debug this further?
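One way to narrow this down from the Java side is to put a timeout on accept(), so the server logs periodically instead of blocking forever; if the timeout keeps firing while Wireshark shows SYN packets arriving, something between the NIC and the socket (e.g. a firewall) is dropping them. A debugging sketch, not the original code:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class AcceptWithTimeout {
    public static void main(String[] args) throws IOException {
        ServerSocket serverSocket = new ServerSocket();
        serverSocket.setReuseAddress(true);
        serverSocket.bind(new InetSocketAddress(33202)); // port from the question

        // With a timeout set, accept() throws SocketTimeoutException instead
        // of blocking forever, so we can log that the server is still waiting.
        serverSocket.setSoTimeout(5000);
        while (true) {
            try {
                Socket socket = serverSocket.accept();
                System.out.println("Accepted " + socket.getRemoteSocketAddress());
                break;
            } catch (SocketTimeoutException e) {
                System.out.println("Still waiting; no connection reached accept()");
            }
        }
    }
}
```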

I was able to solve my problem. It was indeed an issue with the host's Windows Firewall settings. After adding an inbound rule for the javaw.exe of the specific JDK I was using to run the server, the connection was established successfully.

Shout out to Corporate IT, who wouldn't just let me disable the firewall briefly to check whether it was the cause of the problem. :\
