Stupid question from programmer: Why not just eliminate ALL UTC corrections?

From: Scott Moore <samiam_at_moorecad.com>
Date: Tue, 02 Aug 2005 20:32:13 -0700

First of all, I am a programmer who has implemented several timekeeping
systems. My current preference, borrowed, I'm sure, from Unix, is just to
keep large counters and convert to UTC, i.e., count seconds since time N,
or milliseconds, or whatever. With computer time counters moving to 64 bits
for the most part, keeping the number of seconds since whatever epoch you
want to use (including some arbitrary religious figure's birthday) is
trivial. This method does not really have "skippy" problems the way decimal
counters do: if you add a leap second, it's not going to come up with an
invalid time/date, because you are converting from raw seconds/milliseconds
on the spot to come up with the date. I've never personally done anything
about leap seconds, because the fact that the epoch is (apparently) walking
back a few seconds isn't going to break anything, so setting the current
time from GPS or WWV time is going to autocorrect that.
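
To make the counter approach concrete, here is a minimal sketch in C,
assuming a POSIX-style environment; the epoch and the source of the
counter are arbitrary, as noted above:

    /* Keep a 64-bit raw second count; convert to a calendar date only
       at the moment of display, so no stored date can go "skippy". */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        int64_t counter = (int64_t)time(NULL); /* seconds since epoch */
        time_t t = (time_t)counter;
        char buf[64];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
        printf("%s UTC\n", buf);
        return 0;
    }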

Now to the point. There are basically three kinds of clocks:

1. Atomic.

2. Computer internal (counters, network broadcast time, etc.).

3. Desk clock.

(1) Already has its own time reference. Bummer, the earth is not that
good a timepiece, so we have atomic time, and agreement as to what the
"current" value is. Interesting to astronomers, but to virtually nobody
else without their own atomic clock.

(2) Is an application that can notice things like subsecond differences
and minutes with 61 seconds.

(3) Desk clocks (or wristwatches, etc.). Not that accurate, synced to
the power line, etc. Most people understand that they need to be set
once in a while, if for no other reason than power outages.

So let's say we get rid of UTC corrections: leap seconds, hours,
everything. No corrections, because UTC will be defined as "whatever the
freaking planet is doing". If user X is capable of figuring out the
current solar/earth time to 10 places, cool. All that matters is that the
time reference generators like WWV and Internet time servers have good to
excellent accuracy, i.e., they can calculate solar time from atomic time
with enough places to satisfy everyone concerned.

Now, if you are a computer programmer/maker, and you don't want to see
ANY odd variations, 61-second minutes, ANYTHING, then you use atomic
time, or as close as you can get, using GPS, WWV, network time servers,
whatever. Your programs can use this internal counter to measure
differences, derive solar time, whatever. Your accuracy for UTC as
presented to the user is going to be a function of the accuracy of the
"atomic" internal time, your having the current UTC offset (to whatever
places), and/or your skill as a programmer.
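
As a sketch of that division of labor, assume a hypothetical service that
hands out the current solar correction alongside atomic time (no standard
service broadcasts such a value today, so the name and the offset value
here are made up):

    #include <math.h>
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /* Hypothetical correction in seconds, fetched from a time server;
       the value is invented for illustration. */
    static double solar_offset_sec = -0.7;

    /* Differences are measured on the raw atomic counter; the solar
       correction is folded in only when rendering for a human, rounded
       to whole seconds in this simple sketch. */
    void print_solar_time(int64_t atomic_sec)
    {
        time_t solar = (time_t)(atomic_sec + llround(solar_offset_sec));
        char buf[64];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&solar));
        printf("solar (approx): %s\n", buf);
    }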

I would argue that time type (2) is all that needs worrying about, and
computers, the clients of that kind of time, convert between internal
units and units to be presented to users all the time: from internal
binary to external decimal, with some degree N of accuracy.

Without leap seconds, those UTC conversions are going to be MORE accurate
to true solar time, but perhaps LESS accurate to each other, since they
won't all be tracking the same (wrong) leap second approximation of solar
time. But given all the technologies presently available, crystal clocks,
network time distribution, etc., the accuracy is probably going to be
better, probably a lot better, than what is needed.

What I am saying is that computers are grown up enough to deal with
real atomic time, and the conversion to a true solar UTC, without
the crutch of a UTC leap second, hour, half second, millisecond, or
any other deliberately introduced inaccuracy. Astronomers would be
happily calculating UTC to the 10th digit, computers would be stable,
and John Q. Public could ignore the entire thing just as he does now.

The system would also grandfather nicely. Current systems could ignore
it for years, or each installation could correct the accumulating
difference as they like, as often as they like. The net result of the
"fix" would be just like the "leap hours" proposal, which is to say
that the need to do leap second corrections to internal counters
goes away. What would be new is that network time distribution would
probably start sending an atomic time with a UTC/solar correction
factor with accuracies down to fractions of a second. The astronomers
would be giving us neat reports and algorithms to figure the current
UTC offset more accurately, and probably feel more needed and loved
than they have for some while :-)
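
To picture what that distribution might carry, here is a guess at a
record layout; the struct and its field names are hypothetical, not part
of NTP or any existing protocol:

    #include <stdint.h>

    struct time_record {
        int64_t  atomic_seconds;   /* uniform count since some epoch   */
        uint32_t atomic_nanos;     /* subsecond part of the same count */
        int32_t  solar_offset_ms;  /* current UTC/solar correction:
                                      atomic + offset ~= solar time    */
    };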

So computers spend some time calculating the current solar time vs.
atomic time. A lot of computer time is already wasted figuring out
the weather. How fast the planet is spinning is just another interesting
fact of life, and as necessary to communicate to us dumb humans
as conversion to decimal is. I suspect if the Earth were REALLY
inaccurate, as in changing unpredictably by several days per year,
it wouldn't even be an issue. We'd already be thinking of a computer's
ability to predict when to eat lunch as essential.

My 2 cents.

--
Samiam is Scott A. Moore
Personal web site: http://www.moorecad.com/scott
My electronics engineering consulting site: http://www.moorecad.com
ISO 7185 Standard Pascal web site: http://www.moorecad.com/standardpascal
Classic Basic Games web site: http://www.moorecad.com/classicbasic
The IP Pascal web site, a high performance, highly portable ISO 7185 Pascal
compiler system: http://www.moorecad.com/ippas
Good does not always win. But good is more patient.