
What is the difference between C++ and Java timestamps?

In a project that uses JSON as an exchange format, we ran into the problem that millisecond timestamps in C++ and Java are completely different, even though both are described as a long primitive data type.

What kind of standards does each language use and why is there a difference?

As an example, 1407315600 is a C++ timestamp that refers to 06.08.2014 09:00:00 UTC, while in Java it is unreadable!

Reading timestamps in Java is done using new Date(1407315600).

Try

new Date(1407315600L * 1000)

The Java Date constructor expects milliseconds since the epoch, while the C++ timestamp you have looks like it is in seconds, so multiply it by 1000 (the L suffix above keeps the multiplication from overflowing a 32-bit int).
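For illustration, here is a minimal, self-contained Java sketch of that conversion (the class and variable names are just placeholders); it prints the C++ value as a readable UTC date:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class TimestampDemo {
    public static void main(String[] args) {
        long cppSeconds = 1407315600L;           // seconds value received from the C++ side
        Date date = new Date(cppSeconds * 1000); // Date wants milliseconds since the epoch

        SimpleDateFormat fmt = new SimpleDateFormat("dd.MM.yyyy HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println(fmt.format(date));    // prints 06.08.2014 09:00:00
    }
}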

In general, the C++ time_t functions give the time in seconds since the epoch.

To get the time in milliseconds in C++ (comparable to what Java uses), see the C++11 approach below.

C++11

If C++11 is available on the platform, chrono::high_resolution_clock can be used to obtain a higher resolution (note: this clock may be an alias for one of the other clocks or an implementation-defined clock, so its epoch is not guaranteed to be the Unix epoch; if the value must match Java's epoch-based milliseconds, std::chrono::system_clock is the safer choice).

#include <iostream>
#include <chrono>

int main()
{
    using namespace std;
    using namespace std::chrono;

    // Milliseconds elapsed since the clock's epoch.
    // high_resolution_clock's epoch is implementation-defined; use system_clock
    // if the value has to be a Unix timestamp to exchange with Java.
    milliseconds ms;
    ms = duration_cast<milliseconds>(high_resolution_clock::now().time_since_epoch());
    cout << ms.count() << endl;
}
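For comparison, this is roughly how the equivalent millisecond value is obtained on the Java side (a small sketch; the class name is just a placeholder):

import java.time.Instant;

public class NowMillis {
    public static void main(String[] args) {
        // Milliseconds since the Unix epoch, the value java.util.Date expects
        long millis = System.currentTimeMillis();
        System.out.println(millis);

        // Equivalent call on Java 8+ using java.time
        System.out.println(Instant.now().toEpochMilli());
    }
}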
