From: Dr J R Stockton on
In comp.lang.javascript message <15NUn.24$KT3.10(a)newsfe13.iad>, Thu, 24
Jun 2010 18:02:37, Jeremy J Starcher <r3jjs(a)yahoo.com> posted:

>On Thu, 24 Jun 2010 10:33:26 -0700, Ry Nohryb wrote:

>> Do you think that adjusting the operating system's date/time ought to
>> affect a setTimeout(f, ms) or a setInterval(f, ms) ?

It certainly ought not to do so. But it might.

>In many other situations, adjusting the system clock leads to
>unpredictable events, including possible refiring or skipping of cron
>jobs and the like.

AIUI, CRON jobs are set to fire at specific times. A CRON job set to
fire at 01:30 local should fire whenever 01:30 local occurs. A wise
user does not mindlessly set an event to occur during the missing Spring
hour or the doubled Autumn hour, though in most places avoiding Sundays
will prevent a problem.

>It is perfectly reasonable for software to do something unpredictable
>when something totally unreasonable happens.

But changing the displayed time should NOT affect an interval specified
as a duration.

>But what you say and what the computer understands are not the same
>thing. If the OS only has one timer, how do you suggest it keeps track
>of time passage besides deciding to start at:
> +new Date()+ x milliseconds?

By continuing to count its GMT millisecond timer in the normal way and
using it for durations. The displayed time is obtained from a value
offset from that by a time-zone-dependent amount and by a further 18e5
or 36e5 ms in Summer.
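
That arithmetic is easy to sketch in JavaScript itself; note that
getTimezoneOffset() already folds the Summer correction in, so the
offset below covers both adjustments:

  // Sketch: the displayed time derived from the UTC millisecond count.
  var utcMs = +new Date();                    // ms since 1970-01-01 UTC
  var offsetMs = -new Date().getTimezoneOffset() * 60000; // zone + DST
  var displayedMs = utcMs + offsetMs;         // what the wall clock shows
  // Durations should come from the UTC count alone:
  var t0 = +new Date();
  // ... later ...
  var elapsedMs = (+new Date()) - t0;         // unaffected by zone or DST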


A PC has at least two independent clocks, one in the RTC and one using
different hardware (read PCTIM003.TXT, which Google seems to find). The
same seems likely to be true for any computer designed to be turned on
and off.

From: VK on
On Jun 25, 12:01 am, Thomas 'PointedEars' Lahn <PointedE...(a)web.de>
wrote:
> It's "DOM Level 0".  There is no specification in the sense of a Web
> standard (yet).

There is a working draft from 2006, left to languish ever since; it
contains little of value and a lot of question marks:
http://www.w3.org/TR/Window/#window-timers

At the very least for Windows/IE, JavaScript is heavily based on
various C++ runtimes. In particular,
%System%\System32\jscript.dll imports Msvcrt.dll and gets all of its
floating-point math and Date manipulation from there.
So it would be interesting to know how the C++ runtime's own timers
are implemented and how they react to an OS time change. If the OP's
observations are correct, then the "canonical" setTimeout explanation
that goes back to the Netscape docs is incomplete to the point of
being misleading; the proper explanation would be (the added part
marked with asterisks): "The setTimeout method evaluates an
expression or calls a function after a specified amount of time
* since the timer was set, based on the current system time *"
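
If that is right, it should be observable with a minimal sketch: run
the following, set the OS clock back an hour before the 30 s are up,
and time the wait with a real stopwatch:

  var started = +new Date();
  window.setTimeout(function () {
    var elapsed = (+new Date()) - started;
    // Duration-based timer: fires after 30 real seconds, and `elapsed`
    // shows 30000 minus the hour you removed (a negative number).
    // Absolute-target timer: `elapsed` stays near 30000, but the real
    // wait is 30 s plus the hour.
    window.alert("wall-clock elapsed: " + elapsed + " ms");
  }, 30000);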

From: VK on
On Jun 26, 3:44 pm, VK <schools_r...(a)yahoo.com> wrote:
> "The setTimeout method evaluates an expression or calls a function after a
> specified amount of time * since the timer has been set based on the
> current system time *"

In other words,
window.setTimeout("window.alert(1)", 10000);
executed at, say, 2010-06-26 00:01:00 LST (Local System Time)
literally means:

1. Get LST / 2010-06-26 00:01:00

2. Get the delay (10000 ms = 10 s)

3. Set the C++ runtime timer to 2010-06-26 00:01:10 to notify the
JavaScript engine * whenever that moment of LST occurs *.
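
Spelled out as JavaScript, that mechanism would be equivalent to the
following hypothetical polling scheduler (a real engine would use an
OS timer rather than a poll; the point is the absolute target):

  function naiveSetTimeout(fn, delayMs) {
    var target = (+new Date()) + delayMs;  // steps 1 + 2: LST + delay
    (function poll() {
      if (+new Date() >= target) {         // step 3: fire when LST
        fn();                              // reaches the target
      } else {
        window.setTimeout(poll, 10);       // re-check next tick
      }
    })();
  }
  // Put the system clock back and `target` recedes into the future;
  // put it forward and the callback fires early.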
From: VK on
On Jun 26, 3:56 pm, VK <schools_r...(a)yahoo.com> wrote:
> In other words,
>  window.setTimeout("window.alert(1)", 10000);
> executed at, say, 2010-06-26 00:01:00 LST (Local System Time)
> literally means:
>
> 1. Get LST / 2010-06-26 00:01:00
>
> 2. Get the delay (10000 ms = 10 s)
>
> 3. Set the C++ runtime timer to 2010-06-26 00:01:10 to notify the
> JavaScript engine * whenever that moment of LST occurs *.

As a Google search shows, I am right. C/C++ have no built-in timer
functionality, and the add-on implementations in OSs are based on
timePlaced/timeCalled timestamps, not on some absolute coordinate. If
so, it is a widespread bit of laziness in non-real-time OSs.

It may also be interesting that for Windows environments the minimal
delay is 10 ms; any smaller value is automatically raised to 10 ms, so
window.setTimeout("foo()", 0) is perfectly valid but equal to
window.setTimeout("foo()", 10)

Also the maximum delay for Windows environments is 2147483647ms =~ 596
hours =~ 24.8 days, any bigger value will be set to 2147483647ms. See
http://msdn.microsoft.com/en-us/library/ms644906%28v=VS.85%29.aspx
USER_TIMER_MINIMUM and USER_TIMER_MAXIMUM
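
If a delay beyond that cap is genuinely wanted, the usual workaround
is to chain shorter timeouts; longTimeout below is a hypothetical
helper written for illustration:

  var MAX_DELAY = 2147483647;  // USER_TIMER_MAXIMUM, per the MSDN page
  function longTimeout(fn, delayMs) {
    if (delayMs > MAX_DELAY) {
      window.setTimeout(function () {
        longTimeout(fn, delayMs - MAX_DELAY);
      }, MAX_DELAY);
    } else {
      window.setTimeout(fn, delayMs);
    }
  }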

From: Thomas 'PointedEars' Lahn on
Dr J R Stockton wrote:

> Jeremy J Starcher <r3jjs(a)yahoo.com> posted:
>> In many other situations, adjusting the system clock leads to
>> unpredictable events, including possible refiring or skipping of cron
>> jobs and the like.
>
> AIUI, CRON jobs are set to fire at specific times. A CRON job set to
> fire at 01:30 local should fire whenever 01:30 local occurs. A wise
> user does not mindlessly set an event to occur during the missing Spring
> hour or the doubled Autumn hour, though in most places avoiding Sundays
> will prevent a problem.

An even wiser person lets their system, and their cron jobs, run on UTC,
which avoids the DST issue, and leaves the textual representation of dates
to the locale.

>> It is perfectly reasonable for software to do something unpredictable
>> when something totally unreasonable happens.
>
> But changing the displayed time should NOT affect an interval specified
> as a duration.

A duration is defined as the interval between two points in time. The only
way to keep the counter up to date is to check against the system clock. If
the end point of the interval changes when the system clock is modified, the
answer as to whether and when the duration is over necessarily becomes wrong.
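
A sketch of the consequence, using the system clock as the only time
source:

  // Sketch: a duration tracked against the system clock shifts with
  // the clock itself.
  var end = (+new Date()) + 60000;    // "one minute from now"
  function remaining() {
    return end - (+new Date());       // recomputed from the system clock
  }
  // remaining() starts near 60000 and falls toward 0. Put the system
  // clock back one hour mid-wait and remaining() jumps to ~3660000:
  // the "one minute" duration now takes over an hour to elapse.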

>>But what you say and what the computer understands are not the same
>>thing. If the OS only has one timer, how do you suggest it keeps track
>>of time passage besides deciding to start at:
>> +new Date()+ x milliseconds?
>
> By continuing to count its GMT millisecond timer in the normal way and
> using it for durations.

Since a process is not usually granted CPU time every millisecond, this is
not going to work. I find it surprising to read this from you, as you
appeared to be well aware of timer tick intervals of around 50 ms, depending
on the system.
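
That tick is easy to estimate by sampling the clock in a tight loop
(a busy-waiting sketch, so it blocks the page for a moment):

  // Sketch: estimate the system clock tick by recording the smallest
  // non-zero step the millisecond clock is seen to take.
  var last = +new Date(), smallest = Infinity, seen = 0;
  while (seen < 20) {                 // collect 20 clock transitions
    var now = +new Date();
    if (now !== last) {
      if (now - last < smallest) { smallest = now - last; }
      last = now;
      seen++;
    }
  }
  window.alert("clock advances in steps of ~" + smallest + " ms");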


PointedEars