Precise time over time

From: Tom Van Baak <>
Date: Fri, 5 Aug 2005 14:53:16 -0700

Here's a thought on precise time and leap seconds.

The computer I used in college had a refrigerator
sized memory with 32K words and now I have an
iPod with 4 GB. CPU speeds have grown from
below 1 MHz to above 4 GHz. Data rates at my
desk grew from 300 bps to 100 Mbps. Moore's
Law still holds true in the computer industry.

Perhaps there's a sort of Moore's Law for timing.

It would seem hourly accuracy was sufficient
for the common man a couple of hundred years
ago. In my parents' or grandparents' generation
perhaps one-minute accuracy was sufficient.

Things are more precise now: trains in Japan
or Europe, synchronized traffic lights, TV show
times, TiVo, 911 calls, ATM withdrawals,
top-of-the-hour radio broadcasts, stock and
currency transactions, and such. My favorite
example is eBay. It seems we are now in a
world pushing an accuracy level of 1 second
(though there are plenty of farmers who just
need to know what day Spring begins).

OK, take these with a grain of salt, but it does
seem there is a definite trend toward exposing
the common man to more precise time over
the generations. Social and technological
infrastructure easily evolves to support this.

In the past 50 years the precise time community
has improved frequency standards at the rate of
a decade per decade.
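Read as "an order of magnitude of accuracy every ten years," that trend is easy to sketch. The starting point below (roughly 1e-10 fractional frequency accuracy for early cesium standards in 1955) is my own illustrative assumption, not a figure from this post:

```python
# Illustrative sketch of "a decade per decade": frequency-standard
# accuracy improving by a factor of 10 every 10 years.
# The 1955 baseline of ~1e-10 is an assumption for illustration only.

def accuracy(year, base_year=1955, base_accuracy=1e-10):
    """Fractional frequency accuracy under a 10x-per-decade trend."""
    decades = (year - base_year) / 10
    return base_accuracy * 10 ** (-decades)

for year in (1955, 1975, 1995, 2005):
    print(year, f"{accuracy(year):.0e}")
```

Under those assumptions the trend lands at 1e-15 in 2005, which at least agrees with the TAI figure mentioned later in the post.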

So my question is this. Would it be reasonable
to expect that a generation or two from now, the
common man will need sub-second accuracy?

What unimaginable applications will there be
that require this? Star Trek aside, 50 years ago
I would not have envisioned microsecond timing
to support cell phones, nanosecond timing from
GPS, UT1 via VLBI, TAI at 1e-15, atomic clocks
on eBay, Hubble, a Mars Rover, a Deep Impact
comet mission, or even inventions like a quartz
wristwatch or crystal-controlled DACs in portable
CD players and iPods.

So clearly there will be some amazing devices
and applications in the future, some of which
will be based on ever more precise time.

So what kind of UTC will be needed one or two
generations from now? If every car, door, house,
intersection, TV, thermostat, phone, iPod, PDA,
soldier, pet, and child must be sync'd to the
millisecond in order for the unimagined apps
of the future to work, do we really want leap
seconds affecting (or is it infecting) billions
of machines?

In the '60s or early '70s, when leap seconds
started, it's hard to imagine they affected more
than a handful of people or systems. I remember
the first leap second in 1972. Aside from a note
in the newspaper and a missing WWV tick there
was no effect - on computers, or telephones, or
mainframes, or databases; there was no GPS,
no internet, etc. Today UTC and leap seconds
directly or indirectly affect millions of PC's, cell
phones, appliances, satellites, and who knows
what else.

And all of it happens in a very weird way: a posting
at the IERS web site, or an email passed around
like a chain letter, gets to the right people who
make the right moves to update the right tables
in the right machines so that a few months from
now the right thing happens.

I just can't see this leap second stuff working
more than a generation or two into the future
if there's a Moore's Law-like trend in timing.

Perhaps it's not the 1.7 ms/cy/cy deceleration
of the Earth that will cause leap second trouble
in the distant future; it's the 60x per N-hundred
years acceleration of precise time in the lives
of the Earth's modern inhabitants that's the
bigger problem.
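The deceleration figure implies a quadratic problem: a linearly lengthening day means the accumulated offset between Earth rotation time and uniform atomic time grows as the square of elapsed time. A rough sketch, taking the conventional reading of the author's "1.7 ms/cy/cy" as the day lengthening by about 1.7 ms per century (an interpretation on my part, for illustration only):

```python
# Why a slowly lengthening day needs quadratically more leap seconds.
# Assumes the length of day grows linearly by ~1.7 ms per century
# (my reading of the post's "1.7 ms/cy/cy" figure).

DAYS_PER_CENTURY = 36525
LOD_EXCESS_RATE = 1.7e-3  # extra seconds of day length gained per century

def accumulated_offset(centuries):
    """Total divergence (seconds) between rotation time and uniform
    time after `centuries`, integrating a linearly growing daily
    excess -- hence the factor of 1/2."""
    return 0.5 * LOD_EXCESS_RATE * DAYS_PER_CENTURY * centuries ** 2

for c in (1, 2, 5):
    print(f"{c} century(ies): {accumulated_offset(c):.0f} s")
```

Under these assumptions the offset comes to roughly 31 seconds after one century but four times that after two, which is the quadratic growth the post alludes to.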

Received on Fri Aug 05 2005 - 14:55:04 PDT
