
Java: When to use Socket setSoTimeout?

I am making an application where the client sends a message to the server and then waits for 5 seconds (let's assume) for the server to respond, and if there is no return message, it retries. If the server responds with a message, the client processes it. This goes on in a loop and happens again after some time.

For this purpose I was thinking of using setSoTimeout(time) on the client Socket, but after reading the javadoc and a lot of explanations on the internet, I am confused as to whether this approach is right.

What I read on the internet

(1) If I use setSoTimeout on the socket, it sets the timeout for the duration in which the connection needs to be established, and if it is not established, it retries to establish the connection for the given time.

(2) If I use setSoTimeout on the socket, it waits for incoming messages for the specified time interval, and if no message is received, it stops waiting.

My questions are:

(1) Which of the above are true?

(2) If the second statement is true, then can I use it for my implementation?

(3) If the second statement is true, when does the timeout timer kick off exactly? Is it when I declare the socket and set the timeout period on it, or is it when I send the message?

If neither of the explanations applies to my case, what should I do to wait a fixed interval of time on the client side for the server to reply? If the reply does come, I should process it and move on, then redo the same process. If the reply doesn't come, I should move ahead and redo the whole process again.

(1) If I use setSoTimeout() on the socket, it sets the timeout for the duration in which the connection needs to be established, and if it is not established, it retries to establish the connection for the given time.

This is incorrect. setSoTimeout() does not cause re-establishment of the connection at all, let alone 'for the given time'.

(2) If I use setSoTimeout() on the socket, it waits for incoming messages for the specified time interval, and if no message is received, it stops waiting.

This is slightly more accurate, but there is no such thing as a message in TCP.

The correct explanation is that it blocks for up to the specified timeout for at least one byte to arrive. If nothing arrives within the timeout, a SocketTimeoutException is thrown.
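A minimal sketch of that behaviour (the host, port, and 5-second value are placeholders, not from the question):

import java.io.InputStream;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class ReadTimeoutDemo {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("example.com", 7777)) { // placeholder host/port
            socket.setSoTimeout(5000); // applies to blocking reads, not to connect()

            InputStream in = socket.getInputStream();
            byte[] buffer = new byte[1024];
            try {
                int n = in.read(buffer); // the timeout clock runs while this call blocks
                if (n == -1) {
                    System.out.println("Server closed the connection");
                } else {
                    System.out.println("Received " + n + " bytes");
                }
            } catch (SocketTimeoutException e) {
                System.out.println("No data arrived within 5 seconds");
                // The socket is still valid and connected; the read can be retried.
            }
        }
    }
}

Note that a connect timeout is a separate setting, passed to Socket.connect(SocketAddress, timeout); setSoTimeout() has no effect on connection establishment.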

(1) Which of the above are true?

Neither.

(2) If the second statement is true, then can I use it for my implementation?

It isn't, so the second part doesn't apply, but if any statement is true you can use it as part of your implementation. You don't have to ask.

(3) If the second statement is true, when does the timeout timer kick off exactly?

When you call read().

Is it when I declare the socket and set the timeout period on it or is it when I send the message?

Neither.

If neither of the explanations applies to my case, what should I do to wait a fixed interval of time on the client side for the server to reply?

Set a read timeout.
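As a rough sketch of how that could look for the scenario in the question, assuming a simple line-based exchange with a placeholder host, port, message, and retry count:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.net.SocketTimeoutException;
import java.nio.charset.StandardCharsets;

public class ClientWithRetry {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("example.com", 7777)) {   // placeholder host/port
            socket.setSoTimeout(5000);                            // 5-second read timeout

            PrintWriter out = new PrintWriter(
                    new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8), true);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));

            for (int attempt = 1; attempt <= 3; attempt++) {      // retry a few times
                out.println("PING");                              // send the request
                try {
                    String reply = in.readLine();                 // blocks for up to 5 seconds
                    if (reply != null) {
                        System.out.println("Got reply: " + reply); // process it and move on
                        break;
                    }
                    System.out.println("Server closed the connection");
                    break;
                } catch (SocketTimeoutException e) {
                    System.out.println("No reply within 5 seconds, retrying (attempt " + attempt + ")");
                }
            }
        }
    }
}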
