[ntp:questions] Re: NTP precision

Richard B. Gilbert rgilbert88 at comcast.net
Thu Sep 22 20:02:10 UTC 2005


Leandro Pfleger de Aguiar wrote:

>Hi
>
>    I'm using NTP to timestamp digital events in a specific application. For my application I need to know the exact estimated error that my timestamp marks should allow for. Some people frequently use "offset" to mean the real estimated difference between the local and reference clocks. Should I consider other parameters like jitter, precision, and accuracy manually? Remember: I need to say how much my timestamp can be wrong.
>
>Thanks again!
Just a very picky point first: you are using the wrong terminology.  
Precision is how finely your system divides time; tens of milliseconds, 
milliseconds, hundreds of microseconds, tens of microseconds, etc.  
Your system may be able to express time with a precision of 100 
nanoseconds and yet be inaccurate by anything from a hundred 
nanoseconds to several minutes (or days)!
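To see the distinction concretely, here is a small Python sketch (my
illustration, not anything from the original question) that reports the
advertised resolution of the system clock.  Note that this number is the
precision only; the clock could still be off from UTC by any amount:

```python
import time

# The "resolution" field is how finely this clock divides time
# (its precision).  It says nothing at all about accuracy --
# how far the clock is from UTC.
info = time.get_clock_info("time")
print("clock resolution:", info.resolution, "seconds")
```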

The amount by which your clock differs from UTC is the accuracy.

The "offset" is the amount by which your local clock differs from the 
server's clock.  The offset is different for each server.

RFC 1305 
<http://www.eecis.udel.edu/%7Emills/database/rfc/rfc1305/> discusses the 
subject in some detail.  Following the link should take you to a 
directory where you will find the document in both PostScript and PDF form.
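For concreteness, the on-wire offset and delay calculation that RFC 1305
describes can be sketched as follows (the function and variable names
are mine, chosen for illustration):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Clock offset and round-trip delay from the four NTP timestamps.

    t1: client transmit time (client clock)
    t2: server receive time  (server clock)
    t3: server transmit time (server clock)
    t4: client receive time  (client clock)
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Symmetric 50 ms path each way, server clock 10 ms ahead of client:
# offset works out to about 0.010 s, delay to about 0.100 s.
print(ntp_offset_delay(0.000, 0.060, 0.061, 0.101))
```

The offset estimate is exact only if the outbound and return delays are
equal; any asymmetry between the two paths shows up directly as an error
in the offset.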

For a very rough measure you can use one half the round-trip delay to 
the server you are synchronized with; that is the bound on the error 
incurred in transmitting time from server to client. 
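That rule of thumb can be written down in a couple of lines.  This is
only a sketch of the rough bound described above; adding the magnitude
of the current offset is my own conservative extra, so the bound also
covers a clock that has not yet been fully corrected:

```python
def rough_error_bound(delay, offset=0.0):
    """Very rough worst-case timestamp error, in seconds.

    delay:  round-trip delay to the selected server
    offset: current measured offset from that server (optional;
            including abs(offset) is an extra safety margin, not
            part of the half-round-trip rule itself)
    """
    return delay / 2.0 + abs(offset)

# e.g. 40 ms round trip and a 2 ms current offset ->
# a bound of roughly 0.022 s on the timestamp error
print(rough_error_bound(0.040, 0.002))
```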

A series of slides <http://www.eecis.udel.edu/%7Emills/ntp.html> is 
available that illustrates the theory and practice of NTP.  The slides 
are available in PostScript, PDF, and PowerPoint formats.



