Time-of-day inventory (was: more media...)

From: Rob Seaman <seaman_at_NOAO.EDU>
Date: Thu, 24 Jul 2003 07:29:36 -0700 (MST)

I said:

>> .... One would expect that at least as many "applications"
>> worldwide depend on time-of-day as depend on date formats. ....

Ed Davies avers:

> Sorry, but this one doesn't expect anything of the sort. It
> seems to me that many more applications are interested in time
> durations than in the exact orientation of the Earth.

Um, I asserted a statement of inequality:

    time-of-day application count >= date-format application count

And Mr. Davies counters with a different statement of inequality:

    time-duration application count >> time-of-day application count

relying on the implicit assumption that:

    Earth-orientation applications ~= time-of-day applications

For the sake of argument, I'll accept this assumption for now.
However, the truth or falsehood of Mr. Davies' statement doesn't
reveal anything about the truth or falsehood of my statement.

The paltry few people currently having this conversation about leap
seconds (75 on this list, 38 at the Torino colloquium, a few dozen
in the antechambers of the Time Lords - several of whom overlap from
each list) - these few people have yet to undertake anything even
vaguely resembling an inventory of international civil uses of either
time-of-day or interval time worldwide. Instead, the folks who
believe themselves to be charged with stewardship of the UTC standard
have focused exclusively on highly technical uses of an undifferentiated
notion of time.

UTC (~=GMT) is time-of-day. TAI is interval time. They aren't the
same thing. Why should we attempt to make them so? The central
cleverness of UTC is that it provides a mechanism for distributing
knowledge about both GMT and TAI with one time signal. Is this really
necessary anymore?
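The distinction can be made concrete with a small sketch. The leap-second
table and function names below are hypothetical and the table is truncated
to two illustrative entries - a real implementation would consult the full
published table - but the arithmetic shows why UTC time-of-day differences
are not interval time:

```python
from datetime import datetime

# Illustrative, PARTIAL leap-second table: (UTC instant at which the
# new offset takes effect, TAI - UTC in whole seconds afterward).
# A leap second was inserted at the end of 2005, taking TAI-UTC from 32 to 33.
LEAP_TABLE = [
    (datetime(1999, 1, 1), 32),
    (datetime(2006, 1, 1), 33),
]

def tai_minus_utc(t):
    """TAI - UTC offset (seconds) in effect at UTC datetime t."""
    offset = 32  # assume the earliest tabulated offset before the table starts
    for effective, delta in LEAP_TABLE:
        if t >= effective:
            offset = delta
    return offset

def tai_interval(t0, t1):
    """True elapsed (interval) seconds between two UTC datetimes,
    i.e. the naive clock difference plus any leap seconds inserted."""
    naive = (t1 - t0).total_seconds()
    return naive + (tai_minus_utc(t1) - tai_minus_utc(t0))

# One second of clock reading before and after the 2005 leap second:
t0 = datetime(2005, 12, 31, 23, 59, 59)
t1 = datetime(2006, 1, 1, 0, 0, 0)
print((t1 - t0).total_seconds())   # 1.0 - the UTC clock face advanced 1 s
print(tai_interval(t0, t1))        # 2.0 - but 2 SI seconds actually elapsed
```

The naive subtraction answers "what does the clock face say?" (time-of-day);
the corrected subtraction answers "how much time passed?" (interval time) -
two different questions that UTC, by design, lets one signal serve.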

Why don't we just admit the continuing international need to distribute
both time-of-day signals AND interval time signals and stop musing about
lazy crackpot schemes that make the implicit, intellectually bankrupt
statement that in the third millennium our pasty-skinned, big-domed
Outer Limits descendants will have no need to distinguish night from day?

> Perhaps a way of moving the discussion on would be to make a list of
> applications requiring accurate Earth orientation information:

Wonderful idea. I'm not going to rise to the bait of generating such
an ad hoc list in my spare time. The point of my assertions comparing
the looming leap second debacle to Y2K is that the cost of generating
a reliable inventory of such applications (and more importantly, of
generating the inventory of applications reliably shown NOT to depend
on time-of-day/Earth orientation) - that this cost is many orders of
magnitude higher than the cost of 75 techno-bozos stealing time from their
day jobs to contemplate a time standard, feeling alternately amused and
outraged.

Time-of-day is a concept that predates civilization. The sun rises - so
do the cavemen. The sun sets - the men return to the cave. That we have
refined and extended our manifold needs for time-of-day (explicitly NOT
to mention interval time) should come as no surprise to any of the
descendants of our shared ancestors.

Sure, we could just make a stab in the dark that nothing major might
break and simply cut all the world's clocks loose from time-of-day. But
that wasn't our strategy with Y2K - to avoid disaster at the millennium
we first generated an inventory of applications that even potentially
depended on date format, and then we invested time, money and personnel
to vet each of those prioritized applications. After modifying the
applications, we tested them.

Why is a fundamental change to the design of international time standards
being treated with less respect than a simple goofy-ass conceptual bug
in the width of the year field in software?

The fundamental issue with Y2K was that folks wrote code and established
coding practices that persisted for a few decades rather than the few
years that the original programmers might have been satisfied with.
Nobody expected code written in the sixties (or the eighties, for that
matter) to still be running at the end of the nineties.

The notion of time-of-day is as old as man, of course, but was clarified
and written into standards in the eighteenth and nineteenth centuries.
Our baseline is centuries if not millennia - shouldn't we take
correspondingly longer and invest correspondingly more care in reaching
a decision with even wider potential implications for society?

We are about four years into the leap second debate. Only now have a
few articles been published in a variety of forward-looking publications
(the AAS Newsletter, Nature, the Guardian, and a couple more). Four
years into the Y2K debate, thousands - if not millions - of professionals
from hundreds - if not thousands - of fields were participating
in the discussion. Innumerable articles had been published on the Y2K
crisis at the corresponding point of its trajectory. Congressional
hearings (and the equivalent in other countries) had been held.
Y2K had appeared as the butt of jokes in Doonesbury and on SNL.

I don't understand why the precision timing community is treating a
fundamental change to their fundamental standard with less respect
(or at least, attention) than a bug in a data structure.

Rob Seaman
National Optical Astronomy Observatory
Received on Thu Jul 24 2003 - 07:30:01 PDT

This archive was generated by hypermail 2.3.0 : Sat Sep 04 2010 - 09:44:54 PDT