
Detect when a client disconnects from the server

How do you detect when a client disconnects from a server that uses ServerSocket and Socket?

I have an Object I/O stream. When a client disconnects, I want the server to decrement the count of currently running connections and show a message, at more or less the exact moment of disconnection. How can I achieve this?

You will find out when you try to read or write. Detect the disconnect by performing a read(), which will either return -1 (EOF) or throw a SocketException ("connection reset").

In this case, you can detect end-of-stream when the corresponding read operation returns -1 (for raw read() calls), returns null (for readLine() calls), or throws an EOFException (for all other readXXX() calls).
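Here is a minimal server-side sketch of that approach, assuming one handler thread per client with an ObjectInputStream, since the question mentions Object I/O streams. The shared AtomicInteger counter and the log messages are illustrative, not from the original:

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.net.Socket;
import java.net.SocketException;
import java.util.concurrent.atomic.AtomicInteger;

public class ClientHandler implements Runnable {
    // Hypothetical shared counter of currently running connections.
    private static final AtomicInteger activeConnections = new AtomicInteger();

    private final Socket socket;

    public ClientHandler(Socket socket) {
        this.socket = socket;
    }

    @Override
    public void run() {
        activeConnections.incrementAndGet();
        try (ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {
            while (true) {
                Object message = in.readObject(); // blocks until data arrives
                // ... handle the message ...
            }
        } catch (EOFException e) {
            // Clean disconnect: readObject() hit end-of-stream.
            System.out.println("Client disconnected: " + socket.getRemoteSocketAddress());
        } catch (SocketException e) {
            // Abrupt disconnect ("connection reset").
            System.out.println("Connection reset: " + socket.getRemoteSocketAddress());
        } catch (IOException | ClassNotFoundException e) {
            e.printStackTrace();
        } finally {
            int remaining = activeConnections.decrementAndGet();
            System.out.println("Active connections: " + remaining);
        }
    }
}
```

Because readObject() blocks, the handler notices the disconnect as soon as the stream ends or the reset arrives, which gives you the near-immediate notification the question asks for.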

Socket methods will throw a SocketException: socket closed when invoked on a socket that your own code has already closed. That particular exception indicates a programming error on your part, not a remote disconnect.

One more solution to this problem is to exchange heartbeat messages periodically between the two connected endpoints; most telecommunication systems use this approach to detect disconnection.
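A minimal sketch of the sending side of such a heartbeat, assuming a hypothetical "PING" string message and a made-up interval (the message format and timing are assumptions, not from the original):

```java
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class HeartbeatSender {
    // Hypothetical interval; tune it against the peer's timeout in a real system.
    private static final long INTERVAL_SECONDS = 5;

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start(ObjectOutputStream out) {
        scheduler.scheduleAtFixedRate(() -> {
            try {
                synchronized (out) {
                    out.writeObject("PING"); // assumed heartbeat message
                    out.flush();
                }
            } catch (IOException e) {
                // The write failed, so the connection is gone; stop sending.
                scheduler.shutdown();
            }
        }, INTERVAL_SECONDS, INTERVAL_SECONDS, TimeUnit.SECONDS);
    }
}
```

On the receiving side, the peer records the time of the last heartbeat and treats the connection as dead once, say, two or three intervals pass without one. This catches half-open connections that a blocking read alone might never notice.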
