We store dates in the DB as the difference between the date and '1-Jan-1970', in minutes.
Here are the results from the same machine:
--TSQL (MS Sql Server 2008R2)
SELECT DATEADD(minute, 22572765, '1-Jan-1970'); -- gets 1-Dec-2012 12:45:00.000
//JS (Chrome) - assume minDate = 1-Jan-1970
new Date(minDate.getTime() + (22572765 * 60 * 1000)) // gets me 1-Jan-2013 12:45:00
What's the best way to solve the discrepancy?
It sounds to me like your minDate is wrong, because this:
new Date((new Date(1970, 0, 1)).getTime() + (22572765 * 60 * 1000))
...gives me 1-Dec-2012 12:45:00.000 GMT.
In JavaScript, the month value is zero-based, so if you're constructing minDate with new Date(1970, 1, 1), that's why you're a month out.
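To see the off-by-one concretely, here's a small sketch (runnable in Node or a browser console) comparing the two possible epochs, using the minute offset from the question; Date.UTC is used so the result doesn't depend on the local timezone:

```javascript
// The stored value: minutes since the epoch, converted to milliseconds.
const offsetMs = 22572765 * 60 * 1000;

// Correct epoch: month 0 is January.
const jan1970 = new Date(Date.UTC(1970, 0, 1));
// Off-by-one epoch: month 1 is February, which shifts every result a month.
const feb1970 = new Date(Date.UTC(1970, 1, 1));

console.log(new Date(jan1970.getTime() + offsetMs).toUTCString());
// Sat, 01 Dec 2012 12:45:00 GMT  (matches the SQL Server result)
console.log(new Date(feb1970.getTime() + offsetMs).toUTCString());
// Tue, 01 Jan 2013 12:45:00 GMT  (the "month out" result)
```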
Also note that unless you use the UTC variants of the accessors (getUTCFullYear, etc.), JavaScript dates will give you the local-timezone version of the date on output. Your way of constructing the date is correct other than minDate; it's just on output that you need to be careful.
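One way to sidestep minDate entirely: the Date constructor already treats a plain number as milliseconds since 1-Jan-1970 UTC, so you can build the date directly from the stored minutes and read it back with the UTC accessors. A minimal sketch, using the offset from the question:

```javascript
// Milliseconds-since-epoch constructor: no minDate needed at all.
const d = new Date(22572765 * 60 * 1000);

// Local accessors reflect the machine's timezone and may differ per client:
console.log(d.getFullYear(), d.getMonth() + 1, d.getDate(), d.getHours());

// UTC accessors match what SQL Server computed, regardless of timezone:
console.log(d.getUTCFullYear(), d.getUTCMonth() + 1, d.getUTCDate());
// 2012 12 1
console.log(d.toISOString());
// 2012-12-01T12:45:00.000Z
```

Note the `+ 1` on getMonth/getUTCMonth when displaying: the accessors return the same zero-based month the constructor expects.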