[ntp:questions] Is dispersion > jitter in all situations

Danny Mayer mayer at ntp.org
Tue Jan 5 04:33:26 UTC 2010


B wrote:
> Hi, this really concerns me!
> 
> An indicator of expected time is important, and I know about the error
> bounds, but I want to use a value (jitter) as
> an indicator of the expected time's relative offset. Jitter wasn't
> introduced before NTPv4,
> and where I am doing my master's thesis they are using NTPv3 (RFC 1305).
> 

You might want to ask the Uppsala admins why they are using such an old
version. It hasn't been supported for years, and most Unix OSes have
been shipping V4 for years. You could also install your own copy of NTP
on your own system so that you can conduct your own experiments.

> Is it possible to use dispersion relative to the offset as an
> indicator of expected time?
> 
> My idea: peer.dispersion represents the maximum error in the offset
> and the maximum error of half the round-trip delay. If dispersion is
> bigger than jitter, i.e. jitter is bounded by dispersion, then
> dispersion could be used as an indicator of the expected time's
> relative offset.

Jitter is defined in Section 4 of the draft NTP V4 RFC as follows:

"The jitter (psi) is defined as the root-mean-square (RMS) average of
the most recent offset differences, represents the nominal error in
estimating the offset."

Dispersion however is also defined in the same Section 4 as follows:

"The dispersion (epsilon) represents the maximum error inherent in the
measurement."

These are very different quantities. Jitter tells you how much the
measured offset varies from one received NTP packet to the next, while
dispersion is a bound on how accurately you can measure the offset at
all; it gives no indication of the current value of the jitter.

What exactly are you trying to accomplish here?

Danny




