JavaScript Date Difference - one hour too much
I'm pretty new to JavaScript, but I've got an issue.
I have an HTML5 page with two input elements of type time.
I want to get the difference between the two times, so that 11:15 and 12:00 would give me 0:45.
I think the HTML is pretty straight forward:
<ul>
  <li>
    <label>Anfang: </label>
  </li>
  <li>
    <input type="time" id="begin" required="true">
  </li>
</ul>
<ul>
  <li>
    <label>Ende: </label>
  </li>
  <li>
    <input type="time" id="end" required="true">
  </li>
</ul>
Nothing special here I guess. In my JavaScript, I attach a listener to a button which fetches the value of these two fields and passes it to the following function to create date objects out of the time:
function timeToDate(input) {
    var hours = input.substring(0, 2);
    var min = input.substring(3, 5);
    var date = new Date();
    date.setHours(hours);
    date.setMinutes(min);
    return date;
}
To get the difference, I do:
var beginDate = timeToDate(begin);
var endDate = timeToDate(end);
var diff = endDate.getTime() - beginDate.getTime();
var b = new Date();
b.setTime(diff);
console.log(b.getHours());
But the console output for 11:00 (beginDate) and 12:00 (endDate) is 2, when it should be 1. Am I missing something here?
Asked Aug 2, 2013 by Mirco; edited Aug 2, 2013 by Chris Baker.
3 Answers
setTime() takes a number of milliseconds UTC. getHours(), on the other hand, prints the hours in your local timezone.
The "one hour too much" you see when you print getHours() is your time offset from UTC.
You would get the desired hour using getUTCHours().
I'm assuming you're using this algorithm only to compute an hours:minutes representation of a small positive date difference. Then it's fine. Simply don't try to apply it to bigger duration computations.
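A minimal sketch of that fix (the concrete dates are illustrative; only the difference between them matters):

```javascript
// Two Date objects one hour apart, mimicking the 11:00 / 12:00 inputs.
var beginDate = new Date(2013, 7, 2, 11, 0); // Aug 2, 2013, 11:00 local
var endDate = new Date(2013, 7, 2, 12, 0);   // Aug 2, 2013, 12:00 local
var diff = endDate.getTime() - beginDate.getTime(); // 3600000 ms

var b = new Date();
b.setTime(diff); // interprets diff as ms since Jan 1st, 1970 UTC

console.log(b.getHours());    // local hours: shifted by your UTC offset
console.log(b.getUTCHours()); // 1, the expected value
```

In a UTC+1 timezone getHours() prints 2, which matches the behaviour described in the question.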
Using setTime() is error-prone, since what you are actually setting is milliseconds since January 1st, 1970 in UTC time.
Using getHours() will yield the time of day in your time zone, which is likely to be different from UTC.
You will see this if you set the time to 0: getHours() will not necessarily yield 0.
You will also get wraparound effects: if your time delta is longer than 24 hours, you will get a value modulo 24. So to calculate the difference between two dates in hours, I suggest you use regular maths instead:
var hours = (diff/(1000*60*60))|0;
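For instance, with a 26-hour delta the Date-based reading wraps around while the plain division keeps the full value (a small sketch, values chosen for illustration):

```javascript
var diff = 26 * 60 * 60 * 1000; // a 26-hour delta, in milliseconds

// Date-based reading wraps modulo 24 hours:
var d = new Date();
d.setTime(diff);
console.log(d.getUTCHours()); // 2, because 26 mod 24 = 2

// Plain arithmetic keeps the full value; |0 truncates toward zero:
var hours = (diff / (1000 * 60 * 60)) | 0;
console.log(hours); // 26
```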
I don't think a Date object should be used to hold a time offset. With your code, diff contains the number of milliseconds between the two times. You can get the number of hours from this by dividing by 60*60*1000:
var hours=diff/(60*60*1000);
This can result in fractions, though; for example, a difference of 15 minutes would be 0.25 hours. If you need full hours, you can round up or down with Math.ceil() or Math.floor(), for example:
console.log("Full hours: "+Math.ceil(diff/(60*60*1000)));
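Putting the answers together, a sketch that produces the H:MM string the question asks for. The helper name formatDiff is hypothetical; the arithmetic avoids Date objects and their timezone pitfalls entirely:

```javascript
// Format a non-negative millisecond difference as "H:MM".
function formatDiff(diffMs) {
    var totalMinutes = Math.floor(diffMs / (60 * 1000));
    var hours = Math.floor(totalMinutes / 60);
    var minutes = totalMinutes % 60;
    return hours + ":" + (minutes < 10 ? "0" + minutes : minutes);
}

// The 11:15 to 12:00 example from the question:
var diff = ((12 * 60 + 0) - (11 * 60 + 15)) * 60 * 1000;
console.log(formatDiff(diff)); // "0:45"
```

Unlike the Date-based version, this also handles deltas beyond 24 hours without wraparound.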