
NetworkStream.BeginRead when client connection is broken

I've got a TCP client stream that is being read via the async call NetworkStream.BeginRead (passing an async callback delegate). The problem is detecting when the connection is down. Currently, if the connection is cut, the BeginRead call just disappears into the ether: the callback is simply never called. If the app tries a send on the stream while it's down, this does trigger the callback and NetworkStream.EndRead throws an exception, which is fine. But if no send is issued, the app just sits in the dark, unaware that the connection is down.
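For context, here is a minimal sketch of the read pattern being described. The original code isn't shown, so the class name, buffer size and callback structure below are assumptions:

```csharp
using System;
using System.IO;
using System.Net.Sockets;

class Reader
{
    private readonly NetworkStream _stream;
    private readonly byte[] _buffer = new byte[4096];

    public Reader(TcpClient client)
    {
        _stream = client.GetStream();
    }

    public void StartReading()
    {
        // Kick off the asynchronous read; ReadCallback fires when data arrives.
        _stream.BeginRead(_buffer, 0, _buffer.Length, ReadCallback, null);
    }

    private void ReadCallback(IAsyncResult ar)
    {
        try
        {
            int bytesRead = _stream.EndRead(ar);
            if (bytesRead == 0)
            {
                // Remote end closed the connection gracefully.
                return;
            }
            // ... process _buffer[0..bytesRead) here ...

            // Queue the next read.
            _stream.BeginRead(_buffer, 0, _buffer.Length, ReadCallback, null);
        }
        catch (IOException)
        {
            // As described above, a failure is only surfaced here if something
            // forces it (e.g. a failed send); a silently cut connection never
            // triggers the callback at all.
        }
    }
}
```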

Initially I saw that NetworkStream.ReadTimeout was not set (i.e. it was the default Timeout.Infinite), but setting it (to, say, 3000 ms) didn't help. [Edit: the MSDN docs clearly state that ReadTimeout only applies to the synchronous Read call, not the async BeginRead. I should have checked that more carefully earlier :-( ]

How can I detect that the client connection has failed?

Do I have to poll the underlying socket, as shown in this SO question?
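The approach in that question amounts to combining Socket.Poll with Socket.Available. A hedged sketch (the helper name IsConnectionDead is invented here):

```csharp
using System;
using System.Net.Sockets;

static bool IsConnectionDead(TcpClient client)
{
    try
    {
        Socket s = client.Client;
        // Poll reports the socket as readable when data is waiting OR when the
        // connection has been closed/reset. Readable with nothing available
        // therefore usually means the connection is gone.
        bool readable = s.Poll(1000 /* microseconds */, SelectMode.SelectRead);
        return readable && s.Available == 0;
    }
    catch (SocketException)
    {
        return true;  // socket is already in an error state
    }
    catch (ObjectDisposedException)
    {
        return true;  // socket has been closed locally
    }
}
```

Note that this only detects a failure once a FIN or RST has actually arrived; a physically cut connection can still look alive to Poll until TCP keep-alives or an application-level heartbeat force the error to surface.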

In addition: when the connection is physically re-established, the callback still doesn't get called; we just sit waiting in the ether until we try a send.

I ended up having to change from TcpClient's async BeginRead to a synchronous Read with a timeout, running on a dedicated read thread.

This enabled me to raise an event when the timeout fired because no data had arrived on the stream.
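Roughly what that looks like, as a sketch under assumptions rather than the original code (the class name, event names and buffer size are invented):

```csharp
using System;
using System.IO;
using System.Net.Sockets;
using System.Threading;

class PollingReader
{
    public event EventHandler ReadTimedOut;      // no data within the timeout
    public event EventHandler ConnectionFailed;  // read failed or remote end closed

    private readonly NetworkStream _stream;
    private readonly byte[] _buffer = new byte[4096];

    public PollingReader(TcpClient client, int timeoutMs)
    {
        _stream = client.GetStream();
        // Unlike BeginRead, the synchronous Read honours ReadTimeout.
        _stream.ReadTimeout = timeoutMs;
    }

    public void Start()
    {
        var thread = new Thread(ReadLoop) { IsBackground = true };
        thread.Start();
    }

    private void ReadLoop()
    {
        while (true)
        {
            try
            {
                int bytesRead = _stream.Read(_buffer, 0, _buffer.Length);
                if (bytesRead == 0)
                {
                    ConnectionFailed?.Invoke(this, EventArgs.Empty);
                    return;
                }
                // ... process _buffer[0..bytesRead) here ...
            }
            catch (IOException ex)
            {
                // A timed-out Read surfaces as an IOException wrapping a
                // SocketException with SocketError.TimedOut.
                var se = ex.InnerException as SocketException;
                if (se != null && se.SocketErrorCode == SocketError.TimedOut)
                {
                    ReadTimedOut?.Invoke(this, EventArgs.Empty);
                }
                else
                {
                    ConnectionFailed?.Invoke(this, EventArgs.Empty);
                }
                return;
            }
        }
    }
}
```

The timeout event can then drive whatever health-check or reconnect logic the application needs, which the fire-and-forget BeginRead callback never provided.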
