[ntp:questions] dispersion has high peak when reference clock first appears

Nickolay Orekhov nowhere at mail.ru
Mon Apr 8 12:03:51 UTC 2013


Hello!

I've got an external utility that estimates the quality of synchronization. One of
the clues it uses is the current sys peer dispersion.
When a clock goes down for a long period of time, its dispersion filter gets
filled with MAXDISPERSE (16.0).
Then there's a dispersion peak when the clock comes back up and gets
selected again.
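
To make that peak concrete, here is a minimal sketch of the peer dispersion
computation as RFC 5905 describes it (a weighted sum of the per-stage filter
dispersions); the function and variable names are mine, not ntpd internals.
Even one fresh, good sample can't pull the sum down while the other stages
still hold the MAXDISPERSE fill value:

    #include <stdio.h>

    #define NSTAGE      8       /* clock filter stages, as in ntpd */
    #define MAXDISPERSE 16.0    /* "infinity" fill value, seconds */

    /* Peer dispersion per RFC 5905: epsilon = sum(disp[i] / 2^(i+1)),
     * with stages sorted by delay, so each later stage weighs half
     * as much as the one before it. */
    static double peer_dispersion(const double disp[NSTAGE])
    {
        double sum = 0.0, weight = 0.5;
        for (int i = 0; i < NSTAGE; i++) {
            sum += disp[i] * weight;
            weight /= 2.0;
        }
        return sum;
    }

    int main(void)
    {
        double disp[NSTAGE];
        int i;

        /* All stages still hold the MAXDISPERSE fill value. */
        for (i = 0; i < NSTAGE; i++)
            disp[i] = MAXDISPERSE;
        printf("all stale:       %.3f s\n", peer_dispersion(disp));

        /* One fresh, good sample; seven stale stages remain. */
        disp[0] = 0.001;
        printf("one good sample: %.3f s\n", peer_dispersion(disp));
        return 0;
    }

This prints about 15.94 s for the all-stale filter and still about 7.94 s
after the first good sample arrives, which is the peak I'm describing.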

In general I don't think that's logical: I had very good synchronization
with low dispersion, and then there is a peak just because some clock
appeared out of nowhere.

I'm thinking about some additional code. For example, one could delay clock
selection until the whole filter is filled with good dispersion values, i.e.
values not equal to MAXDISPERSE. That would smooth out the moment the clock
reappears (see the sketch below).
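
As a rough illustration of that idea (not a patch against ntpd; the guard
name is hypothetical), the selection code could skip a peer until none of
its filter stages still holds the MAXDISPERSE fill value left by the outage:

    #include <stdbool.h>

    /* Hypothetical guard: treat a peer as selectable only once every
     * filter stage holds a real sample, i.e. no per-stage dispersion
     * is still at the MAXDISPERSE fill value. */
    static bool filter_fully_seeded(const double disp[NSTAGE])
    {
        for (int i = 0; i < NSTAGE; i++)
            if (disp[i] >= MAXDISPERSE)
                return false;
        return true;
    }

The trade-off is a slower return to service: with eight stages and, say, a
64 s poll interval, the returning clock would stay ineligible for roughly
eight minutes after it reappears.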

What do you think about this issue?

