[ntp:questions] .1 Microsecond Synchronization

Unruh unruh-spam at physics.ubc.ca
Thu Jun 4 17:47:50 UTC 2009

Terje Mathisen <"terje.mathisen at tmsw.no"> writes:

>John Hasler wrote:
>> ScottyG writes:
>>> Has anyone had any experience doing this? Can anyone suggest how to
>>> achieve this accuracy?
>> Talk to the very long baseline radio astronomers.
>>> We do have some budget but this but if I need to spend a whole lot on
>>> this I need to get in front of my management with the reasons.
>> You will need to.  Has someone already gotten in front of them with the
>> reasons for this accuracy?
>> Of course, recording timestamps with 100 nanosecond _precision_ is easy.

>That is probably the real requirement:

>Windows' OS clock runs with 100 ns ticks, but the actual resolution and 
>accuracy are normally 10-17 ms.
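
That precision/resolution gap is easy to see for yourself. A rough sketch
(just sampling the clock in a loop, not a proper benchmark): the clock
reports nanosecond units, but the smallest step it actually delivers can
be far coarser.

```python
import time

def observed_clock_step():
    # Sample the system clock rapidly and record the smallest nonzero
    # increment actually observed -- a crude proxy for its real
    # resolution, as opposed to the nanosecond unit it reports in.
    prev = time.time_ns()
    smallest = None
    for _ in range(100_000):
        now = time.time_ns()
        delta = now - prev
        if delta > 0 and (smallest is None or delta < smallest):
            smallest = delta
        prev = now
    return smallest
```

On many machines the smallest observed step is far larger than the 1 ns
unit the interface advertises.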

>That said, yes you can get down to sub-us timestamps, but it requires 
>dedicated hardware to do so:

>First you must have a timing-optimized gps receiver, then you need some 
>hardware which can be slaved to the pps signal from that gps, and which 
>can also be used as the time base of all your logs.

>I.e. a PCI card or similar which your applications can query directly to 
>get the current timestamp.
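
In outline, such a card's time base works roughly like this (a simplified,
hypothetical model, not any particular product): latch a free-running
counter on each GPS PPS edge, then convert later counter readings to time
by interpolating from the last latched edge.

```python
# Simplified, hypothetical model of a PPS-disciplined time base.
# The card latches its free-running counter at each GPS PPS edge;
# timestamps are interpolated from the most recent latch.

class PpsTimebase:
    def __init__(self, counter_hz):
        self.counter_hz = counter_hz   # nominal counter frequency (Hz)
        self.edge_count = None         # counter value at last PPS edge
        self.edge_second = None        # UTC second of that edge

    def on_pps_edge(self, counter_value, utc_second):
        # Called once per second when the GPS asserts PPS.
        self.edge_count = counter_value
        self.edge_second = utc_second

    def timestamp(self, counter_value):
        # Convert a raw counter reading to seconds since the epoch,
        # interpolating from the last PPS edge.
        ticks = counter_value - self.edge_count
        return self.edge_second + ticks / self.counter_hz
```

With a 10 MHz counter, a reading 5 000 ticks past the latched edge comes
out 0.5 ms after that edge's UTC second.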

Except that the card cannot be queried with that accuracy, and the logs
cannot be written with that accuracy, and the receipt of the data from
the network cannot be done with that accuracy. It is a completely
idiotic requirement. It was like when Canada turned metric, and I got a
phone call from someone who needed to translate the labels on a can for
the contents (e.g. 0.1 oz of protein). When I gave them 3 g, this was not
enough. They wanted it to 4 decimal places. 

You can discipline a computer clock to maybe 1 usec accuracy, with
difficulty. ntp cannot do it on a normal computer because of its slow
response to rate changes due to temperature changes -- 3 usec is probably
its limit. If you have a server with a temperature-controlled oscillator,
you can do better.
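
The temperature point can be illustrated with a toy simulation (the
numbers -- drift amplitude, cycle length, poll interval, gains -- are
invented for illustration, and the servo is a crude first-order lag, not
the real ntpd discipline loop): a crystal whose frequency wanders with
temperature, tracked by a loop that corrects its frequency estimate
slowly, accumulates far more offset than one that corrects quickly.

```python
import math

def peak_error_us(loop_gain):
    # Toy model: the true oscillator frequency error swings +/-0.5 ppm
    # over a ~6 h temperature cycle; the discipline loop tracks it with
    # a first-order lag updated every 64 s poll. Returns the peak
    # accumulated clock offset in microseconds over one day.
    freq_est = 0.0        # loop's current frequency estimate (ppm)
    offset_us = 0.0       # accumulated clock offset (microseconds)
    peak = 0.0
    for t in range(0, 86_400, 64):
        freq_true = 0.5 * math.sin(2 * math.pi * t / 21_600)
        offset_us += (freq_true - freq_est) * 64   # ppm * s = us
        freq_est += loop_gain * (freq_true - freq_est)
        peak = max(peak, abs(offset_us))
    return peak
```

In this toy model a sluggish loop (small gain) lags the temperature-driven
frequency swing and piles up a much larger peak offset than a responsive
one -- which is the point about ntp's slow response above.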

Somebody has absolutely no idea what they are talking about when they
set that limit. In fact, if queried, I suspect they do not know the
difference between a microsecond and a millisecond.
