[ntp:questions] garmin 18x and linux

Chris Albertson albertson.chris at gmail.com
Sat Sep 10 00:29:20 UTC 2011

"...The organization of the measurements is very similar to the setup
applied at BIPM and Besançon in the experiment carried out in 1996
(Lewandowski et al., 1997). The Stanford SR-620 time-interval counter
(Fig. 1) is started by the 1 pps pulse from the local UTC clock driven
by the Oscilloquartz EUDICS 3020 cesium standard. The counter is
stopped by the 1 pps pulse from the Motorola Oncore VP..."

So they are measuring the offset between a local cesium clock and the
1 pps output of a Motorola Oncore-type GPS receiver.

On Fri, Sep 9, 2011 at 3:38 PM, unruh <unruh at wormhole.physics.ubc.ca> wrote:
> On 2011-09-09, Chris Albertson <albertson.chris at gmail.com> wrote:
>>>> What we are talking about when we say "accuracy" is the amount of uncertainty.
>>> No we mean how closely the gps absolute time agrees with the utc
>>> absolute time (modulo leap seconds).
>> Accuracy is always expressed as an uncertainty.  For example I might
>> say  "the voltage is 6.32V plus or minus my meter's 2% error"  This
>> means I'm un-certain of the true voltage in the wire but think the
>> value I gave is within 2% of "truth"
> Actually there are two kinds of errors -- systematic and random. Thus,
> the troposphere causes a delay in the signal of about
> 8.1 ns/sin(elevation), where elevation is the angle above the horizon
> of the satellite. The ionosphere has an error of about 9 ns/sin(theta),
> but with a random error of about plus or minus 4.5 ns.
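The quoted elevation-dependent delay model can be sketched in a few lines of Python (the function name, the zenith-delay parameter, and the elevation-mask remark are my additions, not from the post):

```python
import math

def tropo_delay_ns(elevation_deg, zenith_delay_ns=8.1):
    """Simple mapping-function model from the quoted post: the zenith
    delay scaled by 1/sin(elevation)."""
    return zenith_delay_ns / math.sin(math.radians(elevation_deg))

# Delay grows quickly at low elevations, which is one reason timing
# receivers often apply an elevation mask.
print(round(tropo_delay_ns(90.0), 1))  # zenith: 8.1 ns
print(round(tropo_delay_ns(10.0), 1))  # low elevation: about 46.6 ns
```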

A timing GPS has an advantage: it knows its antenna location, and it
knows that location is not changing.  I think this allows for a "best
fit" solution using all eight of the satellites it tracks.  I can't
figure out how much is gained by doing this.
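One way to picture the gain: with the antenna position fixed, the geometric range to each satellite is known, so every pseudorange yields an independent estimate of the receiver's clock bias, and averaging over satellites beats the single-satellite noise. This toy sketch (my own illustration, with made-up numbers, not the receiver's actual algorithm) shows the idea:

```python
import statistics

C = 299_792_458.0  # speed of light, m/s

def clock_bias_estimate(pseudoranges_m, true_ranges_m):
    """With a surveyed antenna position, each satellite's pseudorange
    minus its known geometric range gives an independent clock-bias
    estimate; average them all."""
    biases = [(pr - tr) / C for pr, tr in zip(pseudoranges_m, true_ranges_m)]
    return statistics.mean(biases)

# Toy demo: 8 satellites, true clock bias 100 ns, a few meters of
# fixed measurement noise per satellite (hand-picked, near zero-mean).
true_bias_s = 100e-9
true_ranges = [2.1e7 + i * 1e5 for i in range(8)]
noise_m = [2.0, -1.5, 3.1, -2.2, 0.5, -0.8, 1.9, -2.6]
pseudoranges = [tr + C * true_bias_s + n for tr, n in zip(true_ranges, noise_m)]

est = clock_bias_estimate(pseudoranges, true_ranges)
print(round(est * 1e9, 1))  # about 100.2 ns: close to the 100 ns truth
```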

Yes, I know about the many possible sources of error.  It could be
quite bad.  But then I read the test reports that come out of national
standards labs and get published.  The reports are all pretty much in
agreement that GPS can be astoundingly good.  We can argue that "the
error must be larger", and then we measure, and it's not.

I remember years ago, when the government still had SA turned on, the
error in location was purposely kept quite large.  What they did was
encrypt the low-order bits of the signal.  That was until some smart
person said, "What if I place my GPS at a site that is surveyed at the
centimeter level?  Can't I figure out what the induced error in the
signal is when I know the right answer in advance?"  It turns out the
answer was "yes".
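That trick is the core of differential GPS: a reference station at a surveyed site solves for the error directly and broadcasts it to nearby receivers. A minimal sketch of the arithmetic (the function name and numbers are mine):

```python
def range_correction_m(measured_pseudorange_m, surveyed_range_m):
    """At a site surveyed to the centimeter level, the geometric range
    to each satellite is already known, so any induced common-mode
    error (such as SA) falls straight out of the difference."""
    return measured_pseudorange_m - surveyed_range_m

# If SA inflates the measured pseudorange by 30 m, the reference
# station recovers that 30 m exactly.
print(range_correction_m(20_000_030.0, 20_000_000.0))  # 30.0
```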

I think if you have a known location and a good local crystal
oscillator, all the GPS receiver needs the satellites for is measuring
the crystal oscillator's drift.  The receiver does not need to compute
an instant solution each second; it can maintain a running average of
the current time.  I think it's doing about the same thing NTP does:
it tries to phase-lock its internal clock to the eight GPS satellites
that it tracks.
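The "running average" idea can be illustrated with an exponentially weighted average of the per-second offset measurements (this is my own toy sketch, not the receiver's or NTP's actual discipline algorithm, which is a more elaborate phase-locked loop):

```python
def running_offset_ns(raw_offsets_ns, alpha=0.1):
    """Exponentially weighted running average of the measured offset:
    each second, move a fraction alpha of the way toward the newest
    measurement, smoothing out per-second noise."""
    avg = raw_offsets_ns[0]
    smoothed = [avg]
    for x in raw_offsets_ns[1:]:
        avg = (1 - alpha) * avg + alpha * x
        smoothed.append(avg)
    return smoothed

# A one-off 13 ns outlier followed by a steady 5 ns offset: the
# average decays smoothly toward 5 ns instead of jumping around.
final = running_offset_ns([13.0] + [5.0] * 49)[-1]
print(round(final, 2))  # 5.05
```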

Chris Albertson
Redondo Beach, California
