[ntp:questions] Using both Server and Peer in ntp.conf

Uwe Klein uwe_klein_habertwedt at t-online.de
Thu Feb 8 20:11:58 UTC 2007

Richard B. gilbert wrote:
> Uwe Klein wrote:
>> A question in this context:
>> How is an orphaned server that was well
>> synced for some time degraded in
>> quality and/or stratum
>> with/without the local clock
>> being used as (additional) server
>> over time?
>> uwe
> The answer is: It depends!   (God I love to be helpful!!!!)
We value intention ;-)
> When the server loses its connection to the upstream source, ntpd 
> continues to discipline the clock using the last known good frequency 
> correction.  It cannot compensate for variations in the environment, of 
> which the temperature is the most important.
> How many minutes, hours, or days of "holdover" with reasonably correct 
> time you may get depends on the quality of the local clock, the 
> stability of the temperature, the phase of the moon and the whims of the 
> gods!
> If you must have the correct time, take precautions such as getting one 
> or more hardware reference clocks, redundant internet connections, 
> redundant servers, etc.
> It might be interesting to try the experiment!  Get a server "well 
> synchronized", "orphan" it, and plot the deviation of the clock from the 
> correct time versus elapsed time.  (It's possible that you would be 
> wasting your time; someone may already have done this.)
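Short of running the experiment on real hardware, the expected shape of that plot can be sketched with a toy model: a fixed residual frequency error (what ntpd last learned before being orphaned) plus a random temperature-driven frequency walk. All parameter values below are assumptions for illustration, not measurements:

```python
import random

def simulate_holdover(hours, freq_err_ppm=0.5, wander_ppm_per_hr=0.05, seed=1):
    """Toy holdover model: cumulative clock offset (microseconds) from a
    residual frequency error plus a random frequency walk.  The ppm
    values are assumptions, not properties of any real oscillator."""
    rng = random.Random(seed)
    freq_ppm = freq_err_ppm
    offset_us = 0.0
    trace = []
    for _ in range(hours):
        # temperature etc. slowly wanders the frequency
        freq_ppm += rng.gauss(0.0, wander_ppm_per_hr)
        # ppm error over 3600 s accumulates as microseconds of offset
        offset_us += freq_ppm * 3600.0
        trace.append(offset_us)
    return trace

trace = simulate_holdover(24)
print(f"offset after 24 h: {trace[-1] / 1000:.1f} ms")
```

With the wander term switched off, the model reduces to straight-line drift: a 0.5 ppm residual error alone accumulates 43.2 ms per day, which is roughly the scale of "holdover" error one is reasoning about here.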
Is the NTP algorithm similar enough to a Kalman filter to keep an
estimate of drift and of noise/quality? I think so, right?

With the then-known quality of the local clock, one should be able to
"deteriorate" the reported clock time/quality for as long as contact
is lost.
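NTP does in fact deteriorate its quality estimate this way: per RFC 5905, root dispersion grows at the rate PHI = 15 ppm for every second a source goes unheard, which is exactly what widens the error bars of the "trumpet". A minimal sketch of that rule:

```python
# NTP's built-in "deterioration": root dispersion grows at
# PHI = 15 ppm (RFC 5905) per second since the last update.
PHI = 15e-6  # seconds of dispersion added per second elapsed

def root_dispersion(initial_disp_s, seconds_since_update):
    """Dispersion reported for a source that has been silent for
    seconds_since_update seconds."""
    return initial_disp_s + PHI * seconds_since_update

# One silent hour adds 15e-6 * 3600 = 54 ms of dispersion:
grown = root_dispersion(0.001, 3600)
print(f"{grown * 1000:.1f} ms")
```

So a plot of the reported error bound against time since loss of contact is a straight line opening at 15 ppm, on top of whatever the actual offset does.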

i.e. if you plot offset and jitter in yerror style, you should get a slightly
inclined "trumpet": inclined by the drift (up or down), opening as the jitter increases?

Q: is stratum a purely hierarchical thing (hop distance from a first-class sync
source)?

