
Java server socket data input stream read timeout

I have a Java server and just one client at a time.

  • the client connects and sends its card ID (a blocking read on the server side is fine, because there is only one client at a time)
  • if the card doesn't exist in the database, the server just sends back 0 and closes the socket (no problem)
  • if the card does exist, it sends back 1
  • now the client has to send the PIN to the server, but there has to be some timeout, say 10 s. Here I cannot use a blocking read; what should I do? Socket.setSoTimeout is not an option, because the first read is blocking but the second one should not be.

My advice is to create an ExecutorService and submit the blocking read to it as a task.

There is an example here: http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ExecutorService.html
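A minimal sketch of that approach: run the blocking read in a worker thread and bound the wait with `Future.get(timeout)`. The `Callable` body below just sleeps to stand in for the blocking PIN read; in real code it would read from the socket's `InputStream`. The class name and the 1-second demo timeout are illustrative, not from the original answer.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class PinReadWithTimeout {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<String> pin = pool.submit(() -> {
            Thread.sleep(5_000);          // stands in for a blocking read of the PIN
            return "1234";
        });
        try {
            // Wait at most 1 second here for the demo; use 10 for the real 10 s timeout.
            System.out.println("PIN: " + pin.get(1, TimeUnit.SECONDS));
        } catch (TimeoutException e) {
            pin.cancel(true);             // interrupts the sleeping worker
            System.out.println("timed out");
        } finally {
            pool.shutdownNow();
        }
    }
}
```

One caveat: `cancel(true)` interrupts the worker thread, which unblocks `Thread.sleep` but does not unblock a real `InputStream.read` on a socket; on timeout you would additionally close the socket so the worker's read fails and the thread can exit.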

The correct (but not the easiest) way to do this is to use java.nio.channels.SocketChannel. Its read method reads into a ByteBuffer. You can combine it with a java.nio.channels.Selector to read from multiple sockets without blocking (the selector tells you which one has data available), but in your case a single SocketChannel may be enough.

It is a lot harder to use, though: there is no InputStream, and you have to manage the ByteBuffer yourself.
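To make that concrete, here is a small self-contained sketch: a non-blocking SocketChannel registered with a Selector, where `select(timeout)` returning 0 is the 10-second PIN deadline. The demo opens a local socket pair in one process and the "client" side deliberately never writes, so the select call times out; the class name and 1-second demo timeout are illustrative.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

public class SelectorTimeoutDemo {
    public static void main(String[] args) throws IOException {
        // Local socket pair: the "client" connects but sends nothing.
        ServerSocketChannel server = ServerSocketChannel.open();
        server.bind(new InetSocketAddress("127.0.0.1", 0));
        SocketChannel client = SocketChannel.open(server.getLocalAddress());
        SocketChannel conn = server.accept();

        conn.configureBlocking(false);
        Selector selector = Selector.open();
        conn.register(selector, SelectionKey.OP_READ);

        int ready = selector.select(1_000);   // use 10_000 for the real 10 s timeout
        if (ready == 0) {
            System.out.println("timed out waiting for PIN");
        } else {
            ByteBuffer buf = ByteBuffer.allocate(64);
            conn.read(buf);
            // ... decode the PIN from buf ...
        }
        client.close();
        conn.close();
        selector.close();
        server.close();
    }
}
```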

Another alternative is to start a watchdog thread that sleeps for the duration of your timeout and then closes the Socket if the client hasn't sent the PIN yet. Closing the socket makes the blocked read fail with an exception instead of hanging forever.
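A sketch of that watchdog idea, again as a self-contained demo with a local socket pair whose client side deliberately sends nothing (the class name and the 1-second demo timeout are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class WatchdogDemo {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0);
        Socket clientSide = new Socket("127.0.0.1", server.getLocalPort());
        Socket conn = server.accept();

        // Watchdog: closes the connection after the timeout elapses.
        Thread watchdog = new Thread(() -> {
            try {
                Thread.sleep(1_000);   // use 10_000 for the real 10 s timeout
                conn.close();          // unblocks the read() below
            } catch (InterruptedException | IOException ignored) { }
        });
        watchdog.start();

        try {
            InputStream in = conn.getInputStream();
            int b = in.read();         // blocks until data arrives or the socket closes
            System.out.println("got byte " + b);
        } catch (IOException e) {
            System.out.println("socket closed by watchdog");
        }
        clientSide.close();
        server.close();
    }
}
```

If the PIN does arrive in time, interrupt the watchdog so it doesn't close a socket you are still using.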


The server should always use a read timeout. You can vary it after the first request and response.
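In other words, setSoTimeout can be changed between reads, so "blocking first read, 10-second second read" is just two calls: a timeout of 0 means wait forever. A self-contained sketch (the client here sends the card ID but never the PIN; the class name and 1-second demo timeout are illustrative):

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class VaryingTimeoutDemo {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0);
        Socket clientSide = new Socket("127.0.0.1", server.getLocalPort());
        Socket conn = server.accept();

        clientSide.getOutputStream().write('A');   // the card ID; the PIN is never sent
        InputStream in = conn.getInputStream();

        conn.setSoTimeout(0);                      // first read: block indefinitely
        System.out.println("card ID: " + (char) in.read());

        conn.setSoTimeout(1_000);                  // use 10_000 for the real 10 s timeout
        try {
            in.read();                             // second read: the PIN never arrives
        } catch (SocketTimeoutException e) {
            System.out.println("PIN timed out");
        }
        clientSide.close();
        conn.close();
        server.close();
    }
}
```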
