[ntp:questions] Re: Windows timekeeping - sudden degradation - why?
David J Taylor
david-taylor at blueyonder.co.not-this-bit.nor-this-part.uk.invalid
Thu Dec 8 15:23:30 UTC 2005
Just doing a little more work on this. I wrote a program to display
(approximately) the resolution of the timer (from a timeGetTime() call),
and got the following results:
- QuickTime Player running (not even playing a video): timer resolution
just under 1 ms (about 960 us)
- QuickTime not running: timer resolution seems to step between
approximately 15.6 ms and 10.5 ms.
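For anyone curious how such a measurement might work, here is a minimal
sketch of the idea (not my actual program, which was Windows-specific):
poll a millisecond clock in a tight loop and record how far it jumps each
time it changes. On Windows the clock would be timeGetTime(); here the
polling is factored out so the step detection can be shown with simulated
sample values.

```python
def observed_steps(samples):
    """Return the distinct non-zero jumps in a sequence of millisecond
    timestamps, in the order first seen.  Each jump is one observed
    granularity (tick size) of the underlying clock."""
    steps = []
    for prev, cur in zip(samples, samples[1:]):
        delta = cur - prev
        if delta > 0 and delta not in steps:
            steps.append(delta)
    return steps

# A clock ticking every ~15.6 ms and then every ~10 ms would produce
# poll samples something like this (values in ms):
samples = [0, 0, 16, 16, 31, 31, 47, 57, 57, 67, 77]
print(observed_steps(samples))  # [16, 15, 10]
```

In the real program the samples would come from calling timeGetTime() in
a loop rather than from a list, but the step-detection logic is the same.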
Now these are early results, and my program isn't highly accurate, but
they suggest that the difference may not /only/ be whether the multimedia
timer is running (or is it more accurate to say whether the system timer
is being forced to a higher precision?), but also that something is
switching the basic system clock between a 10 ms step and a 15 ms step.
I do recall that there are a number of different basic clock periods in
Windows - different for NT 4.0, 2000 workstation, 2000 server, and XP -
each of which is (about) either 10 ms or 15 ms.
So when Windows XP SP2 was installed, everything agreed on one system
clock "frequency" (or should I say timer interrupt frequency), hence the
stability of that system. Now it appears I have two system components or
applications arguing over whether the correct value is 10 ms or 15 ms.
Is this really likely?
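Those two step sizes do line up with the two timer-interrupt rates
commonly cited for Windows: roughly 100 Hz (a 10 ms tick) and 64 Hz (a
15.625 ms tick). Which rate applies to which Windows version and HAL is
something I'd want to check, but the arithmetic itself is simple:

```python
# Convert the two commonly cited interrupt frequencies to tick periods.
for hz in (100, 64):
    print(f"{hz} Hz -> {1000 / hz:.3f} ms per tick")
# 100 Hz -> 10.000 ms per tick
# 64 Hz -> 15.625 ms per tick
```

So a 15.6 ms observation fits a 64 Hz interrupt, and a ~10 ms
observation fits a 100 Hz one - consistent with two parties disagreeing
about the interrupt rate.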
Anyone for or against that? Any idea which program might be doing this?
Perhaps it's just my code, and the 10/15 ms switching isn't actually
happening at all!
Thanks,
David