
Set a timeout for recv from a socket on Windows

Hi, I think my code is correct but it doesn't work :(

To set a timeout for the recv function on Windows, I know I must use this code:

    DWORD timeout = 2000;

    if (setsockopt(listenSocket, SOL_SOCKET, SO_RCVTIMEO, (char*)&timeout, sizeof(DWORD)))
    {
        perror("setsockopt");
        return -1;
    }

But it doesn't work.

The code of my server is:

    SOCKET listenSocket;
    SOCKET remoteSocket = INVALID_SOCKET;
    SOCKADDR_IN Server_addr;
    SOCKADDR_IN Client_addr;
    int sin_size;
    short port;

    int wsastartup;
    int ls_result;
    WORD wVersionRequested = 0x0202;
    WSADATA wsaData;

    wsastartup = WSAStartup(wVersionRequested, &wsaData);
    if (wsastartup != NO_ERROR) cout << "Error WSAStartup()" << endl;

    listenSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);

    port = 4000;
    Server_addr.sin_family = AF_INET;
    Server_addr.sin_addr.s_addr = inet_addr("127.0.0.1");
    Server_addr.sin_port = htons(port);

    if (bind(listenSocket, (LPSOCKADDR)&Server_addr, sizeof(struct sockaddr)) < 0) {
        cout << "Server: error bind." << endl;
        closesocket(listenSocket);
        return -1;
    }

    ls_result = listen(listenSocket, SOMAXCONN);

    sin_size = sizeof(struct sockaddr_in);
    remoteSocket = accept(listenSocket, (struct sockaddr *)&Client_addr, &sin_size);

    // SET THE TIMEOUT
    DWORD timeout = 300;
    if (setsockopt(remoteSocket, SOL_SOCKET, SO_RCVTIMEO, (char*)&timeout, sizeof(DWORD)))
    {
        perror("setsockopt");
        return -1;
    }

    int i = 0;
    while (i < 50) {
        t_start = clock();

        // when the client receives the send below, it waits 3 seconds and then transmits the answer
        send(remoteSocket, "code of start transmission", sizeof("code of start transmission"), 0);

        recv_size = recv(remoteSocket, messaggio, sizeof(messaggio), 0);
        end = clock();   // end of the timed read

        printf("time for read = %f seconds\n", ((double)(end - t_start)) / CLOCKS_PER_SEC);

        i = i + 1;
    }

When the client receives the message "code of start transmission" from the server, it waits 3 seconds and then answers the server. I expect the time for read to be about 300 ms and recv_size < 0; instead recv_size < 0, but the time for read is more or less 1.5 seconds (the server waits for the client's message). I don't understand why.

I'm on Windows and I'm using Eclipse and MinGW-w64.

Can someone please help me?

Your code tries to use the socket after it has timed out. This is not a good idea because the socket is still somewhere in the middle of the failed blocking operation and in no shape to start a new operation. There's no way to unwind the portions of the operation that have previously completed and put the socket back where it was before the operation started.

Once a blocking socket operation times out, all you can safely do is close the socket. There is no support for undoing an operation that is partially completed and leaving the socket in any kind of sane state.

If a send or receive operation times out on a socket, the socket state is indeterminate, and should not be used[.] -- MSDN

The SO_RCVTIMEO socket option should never be used in code that's designed to work with sockets. It's a kludge to prevent infinite waits in code that wasn't designed to work with sockets natively. These aren't the droids you're looking for.
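This answer doesn't spell out an alternative, but a common Winsock pattern is to wait for readability with select() and only then call recv(). Below is a minimal sketch of that approach (not the answerer's code; recv_with_timeout and RECV_TIMED_OUT are names made up for illustration): if the wait times out, no blocking receive was ever started, so the socket is still in a sane state and you can decide whether to keep waiting or close it.

    #include <winsock2.h>
    // link with ws2_32 (e.g. -lws2_32 with MinGW-w64)

    // Arbitrary sentinel for "nothing arrived in time".
    static const int RECV_TIMED_OUT = -2;

    // Wait up to timeout_ms for data on a connected, blocking socket, then recv().
    // Returns: >0 bytes read, 0 on orderly shutdown, SOCKET_ERROR on failure,
    // or RECV_TIMED_OUT if the wait expired.
    int recv_with_timeout(SOCKET sock, char* buf, int len, DWORD timeout_ms)
    {
        fd_set readfds;
        FD_ZERO(&readfds);
        FD_SET(sock, &readfds);

        timeval tv;
        tv.tv_sec  = (long)(timeout_ms / 1000);
        tv.tv_usec = (long)(timeout_ms % 1000) * 1000;

        // The first argument is ignored on Windows; it exists for BSD compatibility.
        int ready = select(0, &readfds, NULL, NULL, &tv);
        if (ready == SOCKET_ERROR) return SOCKET_ERROR;   // select() itself failed
        if (ready == 0)            return RECV_TIMED_OUT; // timed out; socket still usable

        return recv(sock, buf, len, 0);                   // data is pending, recv() won't block
    }

With this pattern the server loop can call recv_with_timeout(remoteSocket, messaggio, sizeof(messaggio), 300) each iteration and simply skip or retry on RECV_TIMED_OUT, instead of abandoning a blocking receive midway.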

To set a timeout for the recv function on Windows, I know I must use this code:

    DWORD timeout = 2000;

    if (setsockopt(listenSocket, SOL_SOCKET, SO_RCVTIMEO, (char*)&timeout, sizeof(DWORD)))
    {
        perror("setsockopt");
        return -1;
    }

No. It should be an int, not a DWORD, but the main problem is that you are setting an accept() timeout here, as this is the listening socket. You need to set it on the accepted socket(s).
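A minimal sketch of that fix (receive_once and the buffer are illustrative names, not from the question): apply SO_RCVTIMEO to the socket returned by accept(), check recv() for WSAETIMEDOUT, and, per the other answer above, close the socket after a timeout instead of continuing to use it.

    #include <winsock2.h>
    #include <stdio.h>

    // Sketch: timeout on the *accepted* socket, not the listening one.
    int receive_once(SOCKET remoteSocket)
    {
        int timeout = 300;   // milliseconds; an int, as suggested above
        if (setsockopt(remoteSocket, SOL_SOCKET, SO_RCVTIMEO,
                       (const char*)&timeout, sizeof(timeout)) == SOCKET_ERROR)
        {
            fprintf(stderr, "setsockopt failed: %d\n", WSAGetLastError());
            return -1;
        }

        char buffer[1024];
        int recv_size = recv(remoteSocket, buffer, sizeof(buffer), 0);
        if (recv_size == SOCKET_ERROR && WSAGetLastError() == WSAETIMEDOUT)
        {
            // After a timed-out blocking receive the socket state is
            // indeterminate (see the MSDN quote above), so close it.
            closesocket(remoteSocket);
            return -1;
        }
        return recv_size;
    }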
