
Difference between client and server time

I want to use the server time in a PHP/JavaScript application. Accuracy to the second is quite important for the app. However, discrepancies of up to 3-4 seconds appear, despite accounting for the time difference between client and server.

My Angular code is as follows:

    // Pre-ajax time (so we can see how long it took to get data from the server)
    var timeElapsed = new Date().getTime();

    $http.get('/getServerTime.php')
        .success(function (data) {
            // Calculate the length of the ajax request in milliseconds
            // (getTime() returns milliseconds since the epoch)
            timeElapsed = new Date().getTime() - timeElapsed;

            // Add that to the returned server time to get the current
            // server time (data.date is provided by the getServerTime.php page)
            $scope.serverTime = data.date + timeElapsed;
            $scope.timeDifference = new Date().getTime() - $scope.serverTime;
        });

In theory this should work, but discrepancies of up to four seconds keep occurring.

Any suggestions or code modifications would be gratefully received.

It's not a good idea to mix client and server time at all. Client time is not reliable, since the user can change his clock at any time to anything he wants. If he feels funny, he can set his time to 06.06.2066 right now. What would your program do then? Probably nothing good. Don't mix the two; the best approach is usually to use the server time only. The exception is when you just need the difference between two points in time to measure a duration. In that case you can use the client time, since for a difference of timestamps the absolute value doesn't matter (2016 or 2066, who cares... 3 seconds are 3 seconds).
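To illustrate the duration-only case, here is a minimal sketch (doSomething is just a hypothetical placeholder for whatever work is being timed):

    // Only a difference of client timestamps is used, so the absolute
    // clock value (2016 or 2066) cancels out and never affects the result.
    var start = Date.now();

    doSomething(); // hypothetical work being timed

    var durationMs = Date.now() - start;
    console.log('Took ' + durationMs + ' ms');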

Furthermore, there is another problem in your code: if you "calculate the current server time" by adding the time elapsed since your ajax call to the value returned from the server, you are ignoring the delay it took your request to reach the server in the first place.

Say you send your request at 0 ms: the server receives it at 20 ms and puts its current server time in the response. Your client then gets the result at, say, 50 ms. The calculated server time will therefore be off by 20 ms compared to the actual server time.
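A rough way to compensate for that one-way delay is the NTP-style approximation of assuming it is half of the measured round trip. A sketch against the same /getServerTime.php endpoint, assuming data.date is a millisecond timestamp taken while the server handled the request:

    // Record the client time just before the request goes out.
    var t0 = Date.now();

    $http.get('/getServerTime.php')
        .success(function (data) {
            var t1 = Date.now();           // client time when the response arrived
            var roundTrip = t1 - t0;

            // Assume the server stamped data.date roughly halfway through
            // the round trip, i.e. one-way delay ≈ roundTrip / 2.
            var estimatedServerNow = data.date + roundTrip / 2;

            // A positive offset means the client clock is ahead of the server.
            $scope.clockOffset = t1 - estimatedServerNow;
        });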

In the end, what you are trying to do seems, to be fair, like a rather non-best-practice approach.

Maybe if you tell us what exactly you want to achieve with this approach, we can help you find a better solution.

You can configure your server to return its time in a "Date" header - this is a standard HTTP header - so you don't need a dedicated REST endpoint for it: https://en.wikipedia.org/wiki/List_of_HTTP_header_fields

Even so, you will only have the timestamp from the moment the response left the server.
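With AngularJS's $http, for example, that header can be read off any response; a sketch (note the Date header only has one-second resolution, and on cross-origin requests it may not be exposed to scripts):

    $http.head('/').then(function (response) {
        var dateHeader = response.headers('Date'); // e.g. "Tue, 15 Nov 1994 08:12:31 GMT"
        if (dateHeader) {
            var serverTime = new Date(dateHeader).getTime();
            $scope.clockOffset = Date.now() - serverTime;
        }
    });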

If you have the same problem as me and need to display the time passed since something started on the server, the network latency is not much of an issue. For any other usages you will need to let UTC do the work.
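A sketch of that pattern, where the client clock only ever contributes a difference (the /getStartTime.php endpoint and its serverNow/startedAt fields are hypothetical, and $interval is assumed to be injected):

    $http.get('/getStartTime.php')
        .success(function (data) {
            // Elapsed time at the moment of the fetch, computed purely
            // from the server's own clock.
            var elapsedAtFetch = data.serverNow - data.startedAt;
            var fetchedAt = Date.now();

            $interval(function () {
                // Only a client-side difference is added on top, so the
                // absolute client clock value never enters the display.
                $scope.elapsedMs = elapsedAtFetch + (Date.now() - fetchedAt);
            }, 1000);
        });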

Hope it helps.
