[ntp:questions] Help: fudge time2 value for NMEA driver
Brian.Inglis at SystematicSw.ab.ca
Tue Nov 1 13:31:28 UTC 2016
On 2016-10-31 20:46, ogre up wrote:
> Brian, thanks for your quick response.
> On Tue, Nov 1, 2016 at 4:03 AM, Brian Inglis <Brian.Inglis at systematicsw.ab.ca <mailto:Brian.Inglis at systematicsw.ab.ca>> wrote:
> Presuming a Garmin 18LVC wired to power and a serial port as you have PPS,
> you get better results using the built-in NMEA PPS support as follows:
> symlink /dev/pps0 as /dev/gpspps0, possibly using udev if used by CentOS,
> set NMEA flag1 1 flag3 1, and drop PPS driver .22.0; ...
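On CentOS the symlink would typically be made persistent with a udev rule; a minimal sketch (the rule file name and serial port are assumptions for illustration, adjust to your wiring):

```
# /etc/udev/rules.d/99-gps.rules (assumed name) -- persistent symlinks
KERNEL=="ttyS0", SYMLINK+="gps0"      # serial port the 18LVC is wired to
KERNEL=="pps0",  SYMLINK+="gpspps0"   # PPS device for NMEA unit 0

# or one-shot for testing:
#   ln -s /dev/ttyS0 /dev/gps0
#   ln -s /dev/pps0  /dev/gpspps0
```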
> I've measured the delay between "end of $GPRMC sentence" and the PPS signal with
> an oscilloscope (it is about 213 ms), so I intentionally disabled the built-in NMEA
> PPS support and added the ATOM driver, expecting NTP to produce the same offset value.
> Since "end of $GPRMC" is 213 ms behind, the offset of .20.0 should be +213.
You need to measure the time between the PPS trigger and the end-of-line character \n, as
messages are assumed to relate to the preceding PPS, so you may need to use 787 ms
(1000 ms - 213 ms). You also need to compensate for serial interrupt, character, and
message processing delays, which is why the recommendation is to let the NMEA driver
handle the PPS processing internally.
> - output only $GPRMC as others slow down the Garmin and prevent proper
> PPS handling in the NMEA driver;
> - reduce PPS length to minimum (20ms?);
> - set speed to 9600bps, to reduce I/O and offset time;
> set the corresponding mode 17, and set time2 to 0.500; as long as it is large
> enough to account for the Garmin message output delay (400 ms+), NMEA will use
> the PPS timestamp instead of the end-of-line character time.
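Putting those settings together, an ntp.conf stanza for the NMEA driver might look like this sketch (the time2 value is only a starting point, to be tuned as described next):

```
# NMEA refclock unit 0 (/dev/gps0, /dev/gpspps0)
# mode 17 = 1 ($GPRMC only) + 16 (9600 bps)
server 127.127.20.0 mode 17 prefer
# flag1 1: enable PPS processing; flag3 1: use the kernel PPS discipline
# time2: receiver message delay; 0.500 is a safe initial value
fudge 127.127.20.0 flag1 1 flag3 1 time2 0.500
```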
> If you want to set an accurate time2, start with it at 0, run it for a day,
> average the peerstats offset, try that value, and see if it helps or not.
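The averaging step might be scripted along these lines; the peerstats column layout (MJD, seconds, source address, status, offset, delay, dispersion, jitter) is as ntpd documents it, but the script itself is my own illustrative sketch, and the file path and source address are assumptions:

```python
#!/usr/bin/env python3
"""Average the offset column of ntpd peerstats for one refclock source.

peerstats line format: MJD seconds source status offset delay disp jitter
"""
import sys

def mean_offset(lines, source="127.127.20.0"):
    """Return the mean offset in seconds over peerstats lines for `source`."""
    offsets = [float(f[4]) for f in (line.split() for line in lines)
               if len(f) >= 5 and f[2] == source]
    if not offsets:
        raise ValueError(f"no peerstats entries for {source}")
    return sum(offsets) / len(offsets)

if __name__ == "__main__" and len(sys.argv) > 1:
    # e.g. ./mean_offset.py /var/log/ntpstats/peerstats
    with open(sys.argv[1]) as fh:
        print(f"mean offset: {mean_offset(fh):+.6f} s")
```

If the mean offset came out near -0.213 s, a time2 near 0.213 would be the value to try next; check the sign convention against your own peerstats before applying it.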
> I will try these later.
> I'm confused by the output of ppstest (some digits omitted to make the lines shorter):
> $ sudo ppstest /dev/pps0
> trying PPS source "/dev/pps0"
> found PPS source "/dev/pps0"
> ok, found 1 source(s), now start fetching data...
> source 0 - assert 59.999997, sequence: 31814 - clear 60.899947, sequence: 330
> source 0 - assert 60.999980, sequence: 31815 - clear 60.899947, sequence: 330
> source 0 - assert 60.999980, sequence: 31815 - clear 61.899953, sequence: 331
> source 0 - assert 61.999888, sequence: 31816 - clear 61.899953, sequence: 331
> source 0 - assert 61.999888, sequence: 31816 - clear 62.899951, sequence: 332
GPS receivers cold start running on internal time until they receive enough almanac
and ephemeris data to calculate a position/time solution. Older units like Garmin
may output a free running PPS before they have a valid solution. Give them 15 minutes
to warm up and receive all the data before starting tests. If they have been on for
some time, check the PPS settings on the device and reset them appropriately if wrong.
With your scope, you can check your GPS output and RS232 input levels.
If your GPS power is too high or too low, or your interface impedance is outside spec,
your signals may be outside spec for your RS232 inputs, which may be only 0-5V or
lower these days with TIA-232-F. Your interface could also have contact or wiring problems.
> $ ntpq -pn
> remote refid st t when poll reach delay offset jitter
> *127.127.20.0 .GPS. 0 l 1 8 377 0.000 -115.02 1.120
> o127.127.22.0 .PPS. 0 l 56 64 377 0.000 0.040 0.020
> As I've specified the "fudge flag2 1" option for the ATOM driver to use the falling edge of
> the PPS, when the clock is synchronized to .22.0, the fraction of the timestamp for the clear
> event should be nearly zero. In my case the fraction is about 0.9. It doesn't make sense.
Messages will be output when channel processing allows, so they suffer variable delay
and jitter; NMEA PPS processing substitutes the PPS time for the end-of-line time
when only a single message is output per second.
If you can set your scope to trigger on the DCD leading edge, or on a higher quality
PPS source, and sweep to a bit more than the nominal pulse duration, you should
be able to see the differences between pulse durations and rise and fall times.
For more good advice on Garmin issues see:
The leading edge is synced to the UTC (or initially GPS) second with a sharp rise time;
the pulse duration, especially on Garmins, will be some approximation to that requested,
probably performed at a lower scheduling priority, making the duration less accurate;
the trailing edge usually has a somewhat longer fall time, adding more variance to the
duration.
The RS-232 signal specs say:
* For control signals, the transit time through the transition region should be less than 1ms.
* For data and timing signals, the transit time through the transition region should be:
- less than 1ms for bit periods greater than 25ms,
- 4% of the bit period for bit periods between 25ms and 125µs,
- less than 5µs for bit periods less than 125µs.
* The rise and fall times of data and timing signals ideally should be equal,
but in any case vary by no more than a factor of three.
* Maximum signal slew rate should not exceed 30V/µs.
Bit time at 4800bps is about 208µs with transit less than about 8µs, and at 9600bps
about 104µs with transit less than about 4µs, so rise times could be as slow as specified
or as fast as 1/3 of those times, assuming the PPS pulse goes through the GPS RS232 interface;
serial driver interrupt processing delays add to the PPS offset and jitter.
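As a sanity check on those numbers, a small sketch (my own arithmetic, not from the spec text) computing the bit period and the 4% transition-region limit for the two baud rates:

```python
def bit_period_us(baud):
    """Bit period in microseconds for a given baud rate."""
    return 1_000_000 / baud

def max_transit_us(baud):
    """RS-232 transition-region limit: 4% of the bit period, for bit
    periods between 25 ms and 125 us (4800 and 9600 bps fall in this range)."""
    return 0.04 * bit_period_us(baud)

for baud in (4800, 9600):
    print(f"{baud} bps: bit period {bit_period_us(baud):.1f} us, "
          f"transit < {max_transit_us(baud):.1f} us")
```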
Better quality GPS receivers provide very short pulses designed for 50 ohm inputs,
allow compensation for antenna and PPS cable delays, and compensate for quantization
error due to the internal clock rate and its drift against the GPS solution time,
which produces a sawtooth PPS error; the receiver may report this as sawtooth error
correction data, which software can use to correct the PPS time as seen on high
speed hardware interfaces.
Take care. Thanks, Brian Inglis, Calgary, Alberta, Canada