Re: [LEAPSECS] Proposal for a Smoothed Coordinated Universal Time (UTS)

From: Tom Van Baak <tvb_at_leapsecond.com>
Date: Wed, 25 Oct 2000 15:09:34 -0700

Markus,

Interesting proposal. It's a distant echo of the first half
of the 1960s, the pre-leap-second era, during which both
leap tenth-seconds and periodic frequency offsets of a few
parts in 10^8 were applied in an attempt to keep atomic
time synchronized with Earth rotation
(ftp://maia.usno.navy.mil/ser7/tai-utc.dat).

By 1972 both sub-second leaps and frequency jumps had been
abandoned as practical ways to keep atomic time
synchronized.

So basically you propose replacing a leap second with
1000 consecutive leap milliseconds.
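
If I've read it right, the bookkeeping amounts to
something like this sketch in C (raw_ms, a leap-ignoring
millisecond counter, and leap_ms, the leap instant, are my
hypothetical names, and this is one reading of the scheme,
not necessarily your exact algorithm):

    /* Sketch: spread a positive leap second over the 1000
     * seconds preceding it as 1000 consecutive 1 ms steps.
     * Hypothetical names; one reading of the scheme. */
    long long uts_ms(long long raw_ms, long long leap_ms)
    {
        long long before = leap_ms - raw_ms; /* ms to leap */
        if (before >= 1000000) return raw_ms; /* not begun */
        if (before <= 0) return raw_ms - 1000; /* absorbed */
        /* withhold one more millisecond each second */
        return raw_ms - (1000000 - before) / 1000;
    }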


I recall one of the motivations for retiring leap seconds
was to relieve system designers from having to devise
their own continuous (leap-less) time scales as an
alternative to using UTC.

If an autonomous timekeeper (e.g., an embedded system) or
a lazy timekeeper (e.g., a common workstation, PC,
television, or telephone) can't know about leap seconds,
how will it know about leap milliseconds?


One other observation: what class of system is capable of
1 ms resolution? Millions (billions) of electronic
timekeeping devices have 1 second resolution and are thus
capable of implementing a leap second jump should their
users actually require it. But it's not clear how one
would adjust, for example, an Intel PC, a PDA, or a cell
phone by exactly 1 ms each second.
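
For what it's worth, the nearest existing primitive on a
Unix-style PC is adjtime(), which slews the clock rather
than stepping it, though its slew rate is
implementation-defined rather than exactly 1 ms per
second. A minimal sketch, assuming a BSD/Linux system and
superuser privileges:

    /* Ask the kernel to absorb a +1 leap second by slewing
     * the clock back 1 s gradually rather than stepping. */
    #include <sys/time.h>
    #include <stdio.h>

    int main(void)
    {
        struct timeval delta = { -1, 0 };  /* -1.000000 s */
        if (adjtime(&delta, NULL) == -1) {
            perror("adjtime");
            return 1;
        }
        return 0;
    }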


Some workstations have a binary-friendly (e.g., 1024 Hz)
real-time clock interrupt rate rather than a
decimal-friendly (e.g., 1000 Hz) one. Were UTS to be
implemented on these systems it would be preferable to
add/delete 1/1024th of a second each second for 1024
seconds rather than to use your decimal-based scheme.
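
(One tick per second for 1024 seconds absorbs
1024 x 1/1024 s = exactly 1 s, at a frequency deviation of
1/1024, or about 0.098%, nearly the same as your 0.1%
figure.)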


Perhaps your proposal should give the system designer the
flexibility to set the quantum of the milli-leap increment
to best match the resolution of the hardware clock, or to
match the frequency stability requirements of the
application, rather than the arbitrary 0.1% frequency
deviation limit you picked.
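
Something like this hypothetical sketch, in C, folds the
1000 Hz and 1024 Hz cases into one routine (the names and
the parameterization are mine, not part of your proposal):

    /* Generalized smear (hypothetical sketch).
     * hz      : clock resolution in ticks per second
     * quantum : ticks withheld per second; the frequency
     *           deviation is quantum/hz (UTS: hz = 1000,
     *           quantum = 1, i.e. 0.1%)
     * One full second (hz ticks) is absorbed over
     * hz/quantum seconds before the leap. */
    long long smear(long long raw, long long leap,
                    long long hz, long long quantum)
    {
        long long window = (hz / quantum) * hz; /* ticks */
        long long before = leap - raw;  /* ticks to leap */
        if (before >= window) return raw;  /* not begun */
        if (before <= 0)      return raw - hz; /* done  */
        return raw - ((window - before) / hz) * quantum;
    }

With hz = 1000 and quantum = 1 this reproduces the 1 ms
scheme; with hz = 1024 and quantum = 1 it gives the binary
variant; with quantum = hz it degenerates to a plain 1 s
step.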

Some systems can easily tolerate 1 second jumps;
perhaps other systems cannot tolerate 1 ms jumps
but might tolerate 1 us or 1 ns jumps. Still other
systems may not tolerate any time jumps at all but
would accept a frequency jump that would implement
the 1 second phase shift over some amount of time.
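
(The arithmetic is simple: the required fractional
frequency offset is 1 second divided by the slew interval,
so 0.1% over 1000 s, 500 ppm over 2000 s, or 1 ppm over
roughly 11.6 days.)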

Some systems (e.g., NTSC color television) implement time
jumps by dropping frames. The point I'm making is that a
fixed proposal of exactly 1000 one-millisecond time steps
seems overly specific for the wide variety of both
existing and future timekeeping devices.

/tvb
http://www.leapsecond.com/
Received on Wed Oct 25 2000 - 15:19:50 PDT
