Looking over the previous questions and answers, it appeared this should work:
var palindrome = new Date('2011-11-11');
var december = new Date('2011-11-11');
december.setDate(palindrome.getDate()+20);
// should be December, but in fact it loops back over to Nov 1st
my jsFiddle
Is there a simple way to ensure that months are incremented correctly, or have I missed something obvious?
You could do it like this:
var dayOffset = 20;
var millisecondOffset = dayOffset * 24 * 60 * 60 * 1000;
december.setTime(december.getTime() + millisecondOffset);
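To illustrate, here is a minimal sketch of this millisecond-offset approach; the starting date is built from components (an assumption, to sidestep string parsing), and note that adding a fixed number of milliseconds can drift by an hour if the range crosses a daylight-saving change:

```javascript
// Start from Nov 11, 2011, built from components rather than a string.
var december = new Date(2011, 10, 11); // month index 10 = November
var dayOffset = 20;
var millisecondOffset = dayOffset * 24 * 60 * 60 * 1000;
december.setTime(december.getTime() + millisecondOffset);
// Nov 11 + 20 days lands on Dec 1, 2011 (month index 11), barring a DST shift.
console.log(december.getMonth() + 1, december.getDate());
```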
The getMonth() call returns a value between 0 and 11, where 0 is January and 11 is December, so 10 means November. You need to increment the value by 1 when using it in a string. If you simply output the date as a string you'll see that it is correct. Note I also had to change the starting date format: the parser didn't seem to like 2011-11-11, so I made it 11/11/2011. http://jsfiddle.net/9HLSW/
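The parsing difference is worth knowing about: in modern engines the ISO date-only form 2011-11-11 is interpreted as midnight UTC, while a form like 11/11/2011 is typically interpreted as local time, so in a time zone behind UTC the two can land on different calendar days. A small sketch (the behavior of the non-ISO form is implementation-dependent, so this reflects common engines, not a guarantee):

```javascript
var iso = new Date('2011-11-11');  // ISO date-only: parsed as midnight UTC
var us  = new Date('11/11/2011');  // non-ISO: typically parsed as local midnight
console.log(iso.getUTCDate()); // 11 in UTC terms, always
console.log(us.getDate());     // 11 in local terms
// West of UTC, iso.getDate() can report 10 — the classic "off by a day" surprise.
```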
Your code is correct; however, you are converting it to a string incorrectly. getMonth() starts at 0 for January and ends at 11 for December, so all you need to do is add 1 to the month, like this:
alert(endDate.getFullYear() + "-" + (endDate.getMonth()+1) +"-"+ endDate.getDate());
Notice the additional brackets: because you are performing a math operation while concatenating strings, without them the month index and the 1 are concatenated as digits rather than added, and you end up with the wrong month in the output.
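To make that pitfall concrete, here is a quick sketch of what happens with and without the brackets (the date value is just an example):

```javascript
var endDate = new Date(2011, 11, 1); // Dec 1, 2011 (month index 11)
// Without brackets, + evaluates left to right as string concatenation:
var wrong = endDate.getFullYear() + "-" + endDate.getMonth() + 1;   // "2011-111"
// With brackets, the numeric addition happens first:
var right = endDate.getFullYear() + "-" + (endDate.getMonth() + 1); // "2011-12"
console.log(wrong, right);
```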
To see whether you got the date correct, use endDate.toDateString() to display the date in a human-readable form that spells out the month name:
alert(endDate.toDateString());
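Putting the question's arithmetic together with the corrected formatting, a sketch of the end-to-end fix (variable names follow the answer; the component-based constructor is an assumption, used to sidestep the string-parsing quirk):

```javascript
var palindrome = new Date(2011, 10, 11);      // Nov 11, 2011 (month index 10)
var endDate = new Date(palindrome.getTime()); // copy, leaving the original intact
endDate.setDate(palindrome.getDate() + 20);   // setDate rolls over into December
var formatted = endDate.getFullYear() + "-" + (endDate.getMonth() + 1) + "-" + endDate.getDate();
console.log(formatted);               // "2011-12-1"
console.log(endDate.toDateString());  // spells out the month: "Thu Dec 01 2011"
```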
For more info on the Date object, check out the Date reference section on w3schools.