[ntp:questions] Displaying nanoseconds

David L. Mills mills at udel.edu
Fri Feb 16 18:00:12 UTC 2007


Richard,

The interpretation of precision has persisted for the life of the NTPv4 
implementation. Precision <is> the smallest difference in time the 
clock can represent. While it might appear that the smallest difference 
is the resolution, the operating system call cannot truthfully represent 
differences less than the time it takes to read the clock.
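As a concrete illustration (a minimal sketch for a POSIX system, not 
anything taken from the ntpd sources), timing back-to-back reads of the 
system clock and keeping the smallest nonzero difference gives an 
estimate of precision in this sense, since the clock cannot show a step 
smaller than the time it takes to read it:

/*
 * Sketch: estimate precision as the smallest nonzero difference
 * visible between two back-to-back reads of the system clock.
 */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec a, b;
    long min_ns = -1;

    for (int i = 0; i < 100000; i++) {
        clock_gettime(CLOCK_REALTIME, &a);
        clock_gettime(CLOCK_REALTIME, &b);
        long d = (b.tv_sec - a.tv_sec) * 1000000000L
            + (b.tv_nsec - a.tv_nsec);
        if (d > 0 && (min_ns < 0 || d < min_ns))
            min_ns = d;          /* smallest observable step */
    }
    printf("smallest observed clock step: %ld ns\n", min_ns);
    return 0;
}

The result reflects the cost of the call rather than the nominal 
resolution of the timestamp, in line with the several-hundred-nanosecond 
figure below.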

Once upon a time, when the clock resolution was in the milliseconds and 
the time to read the clock was 42 microseconds, the difference between 
resolution and precision wasn't really significant. However, now the 
potential resolution is less than a nanosecond while the time to read 
the clock is several hundred nanoseconds, so the difference is critical.
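To see the two quantities side by side, the same POSIX interface can be 
asked for both: clock_getres() reports the advertised resolution of the 
timestamp, while timing a loop of clock_gettime() calls gives the 
average cost of one read. Again, this is only an assumed sketch for 
illustration, not how ntpd determines its precision:

/*
 * Sketch: advertised resolution vs. the average time one clock
 * read actually takes.
 */
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec res, t0, t1, t;
    const int n = 100000;

    clock_getres(CLOCK_REALTIME, &res);     /* nominal resolution */

    clock_gettime(CLOCK_REALTIME, &t0);
    for (int i = 0; i < n; i++)
        clock_gettime(CLOCK_REALTIME, &t);  /* the call being timed */
    clock_gettime(CLOCK_REALTIME, &t1);

    long total_ns = (t1.tv_sec - t0.tv_sec) * 1000000000L
        + (t1.tv_nsec - t0.tv_nsec);

    printf("advertised resolution: %ld ns\n", res.tv_nsec);
    printf("average cost per read: %ld ns\n", total_ns / n);
    return 0;
}

On a kernel that advertises 1 ns resolution, the first number is 1 while 
the second is typically a few hundred, which is exactly the gap 
described above.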

Dave

Richard B. gilbert wrote:

> mills at udel.edu wrote:
> 
>> Rune,
>>
>> The vanilla ntptime shows precision, but it really should say 
>> resolution. By definition, precision is the time taken to read the 
>> system clock, ranging from 42 microseconds in a SPARC IPC to 500 
>> nanoseconds in a Sun Blade 1500.
>>
>> Dave
> 
> 
> Dave,
> 
> That appears inconsistent with other NTP usage of precision, at least as 
> I've understood it.  My understanding was that precision was the 
> smallest difference in time that the clock could represent; e.g. the 
> value of the least significant bit.
> 



