[ntp:questions] strange behaviour of ntp peerstats entries.
David L. Mills
mills at udel.edu
Mon Jan 28 18:40:07 UTC 2008
It would seem self evident from the equations that minimizing the delay
variance truly does minimize the offset variance. Further evidence of
that is in the raw versus filtered offset graphs in the architecture
briefings. If nothing else, the filter reduces the variance by some 10
dB. More to the point, emphasis added, the wedge scattergrams show just
how good the filter can be. It selects points near the apex of the
wedge, the others don't matter. You might argue the particular clock
filter algorithm could be improved, but the mission in any case is to
select the points at or near the apex.
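To make the idea concrete, here is a minimal sketch (not the actual ntpd clock filter, which keeps more state) of selecting the sample nearest the apex of the wedge, i.e. the one with the smallest round-trip delay; the sample values are made up for illustration:

```python
# Sketch: from recent (offset, delay) exchange samples, pick the sample
# with the smallest round-trip delay -- the point nearest the apex of
# the wedge scattergram. The other points simply don't matter.

def select_apex(samples):
    """samples: list of (offset, delay) tuples; return the chosen offset."""
    offset, delay = min(samples, key=lambda s: s[1])
    return offset

samples = [(0.012, 0.080), (0.003, 0.021), (0.045, 0.150), (0.005, 0.025)]
print(select_apex(samples))  # 0.003, the offset of the minimum-delay sample
```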
While the authors might not have realized it, the filter method you
describe is identical to Cristian's Probabilistic Clock Synchronization
(PCS) method described in the literature some years back. The idea is
to discard the outlier delays beyond a decreasing threshold. In other
words, the tighter the threshold, the more outliers are tossed out, so
you strike a balance. I argued then and now that it is better to select
the best from among the samples than to selectively discard the outliers.
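The contrast between the two approaches can be sketched as follows; this is a hedged illustration of the PCS-style threshold idea, not code from Cristian's paper, and the threshold value is made up:

```python
# Sketch of the PCS-style idea: keep only exchanges whose round-trip
# delay falls below a threshold. Tightening the threshold discards more
# outliers but leaves fewer samples -- the balance mentioned above.

def pcs_filter(samples, threshold):
    """Discard (offset, delay) samples whose delay exceeds the threshold."""
    return [(off, d) for off, d in samples if d <= threshold]

samples = [(0.012, 0.080), (0.003, 0.021), (0.045, 0.150), (0.005, 0.025)]
print(pcs_filter(samples, 0.030))  # only the two low-delay samples survive
```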
There may be merit in an argument that says the points along the limbs
of the wedge are being ignored. In principle, these points can be found
using a selective filter that searches for an offset/delay ratio of 0.5,
which in fact is what the huff-n'-puff filter does. To do this
effectively you need to know the baseline propagation delay, which is
also what the huff-n'-puff filter does. Experiments doing this with
symmetric delays, as against the asymmetric delays the huff-n'-puff
filter was designed for, were inconclusive.
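The correction described above can be sketched like this; it is an illustrative approximation of the huff-n'-puff idea, assuming the excess delay above the baseline falls entirely on one direction of the path:

```python
# Sketch of the huff-n'-puff correction: with the baseline (minimum)
# propagation delay known, a sample lying on a limb of the wedge can be
# pulled back toward the apex by attributing the excess delay to a
# single direction of the path.

def huff_n_puff(offset, delay, min_delay):
    """Correct an (offset, delay) sample for one-way queuing delay."""
    excess = delay - min_delay
    if offset > 0:
        return offset - excess / 2.0  # excess delay on the outbound leg
    return offset + excess / 2.0      # excess delay on the return leg

# A point on the upper limb: offset rises at half the rate of the delay.
print(huff_n_puff(0.040, 0.100, 0.020))  # 0.0, back at the apex
```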
> Oh yes. popcorn suppression is important. I agree. But the filter goes well
> beyond that. My reaction is that on the one hand people keep saying how
> important net load is, and that one does not want to use poll intervals
> that are much smaller than 8 or 10, and on the other hand, they throw away
> 80-90% of the data collected. Reminds me of the story of Saul, king of the
> Israelites, whose army was besieged, and he mentioned that he was thirsty.
> A few of his soldiers risked everything to get through the enemy lines and
> bring him water. He was so impressed that he poured it all out on the
> ground, in tribute to their courage. I have always found that story an
> incredible insult to their bravery instead.
> The procedure does drastically reduce the variance of the delay, but does
> not do much for the variance of the offset, which is of course what is
> important. Just to bring up chrony again, it uses both a suppression where
> round trips greater than say 1.5 of min are discarded, and data is weighted
> by some power of the inverse of the delay.
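The chrony-style scheme described in the quoted text can be sketched as follows; the 1.5x ratio comes from the text above, while the weighting exponent and sample values are made-up illustrations:

```python
# Sketch of the scheme described above: discard samples whose round-trip
# delay exceeds 1.5x the minimum seen, then combine the survivors'
# offsets weighted by an inverse power of their delay.

def weighted_offset(samples, ratio=1.5, power=2):
    """samples: list of (offset, delay); return a delay-weighted mean offset."""
    min_delay = min(d for _, d in samples)
    kept = [(off, d) for off, d in samples if d <= ratio * min_delay]
    weights = [1.0 / d ** power for _, d in kept]
    return sum(off * w for (off, _), w in zip(kept, weights)) / sum(weights)

samples = [(0.012, 0.080), (0.003, 0.021), (0.045, 0.150), (0.005, 0.025)]
print(round(weighted_offset(samples), 4))  # low-delay samples dominate
```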