Re: [LEAPSECS] DRM broadcast disrupted by leap seconds

From: Markus Kuhn <Markus.Kuhn_at_cl.cam.ac.uk>
Date: Sat, 19 Jul 2003 12:30:07 +0100

Ed Davies wrote on 2003-07-19 09:15 UTC:
> > When the scheduled transmission time
> > arrives for a packet, it is handed with high timing accuracy to the
> > analog-to-digital converter,
>
> I assume you mean digital-to-analog.

Yes, sorry for the typo.

> This also raises the point that because the transmission is delayed a few
> seconds for buffering there is presumably a need for the studio to work
> "in the future" by a few seconds if time signals are to be transmitted
> correctly.

All modern digital broadcast transmission systems introduce significant
delays due to compression and coding. It is therefore common practice
today to run the studio clocks a few seconds (say T = 10 s) early, and
then to delay the signal in digital buffers between the studio and the
various transmitter chains by T minus the respective transmission and
coding delay. This way, both analogue terrestrial and digital satellite
transmissions carry reasonably synchronous audio and video. Otherwise,
your neighbour would already be cheering in front of his analogue TV
set while you still hear on DRM the "live" report about the football
player approaching the goal.
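
To illustrate the arithmetic: with the studio clock running T = 10 s
early, each transmitter chain inserts a buffer of T minus its own coding
and transmission delay, so that all outputs reach the listener at roughly
the same time. A small sketch in C (the per-chain delay figures are
invented, purely for illustration):

  #include <stdio.h>

  int main(void)
  {
      const double T = 10.0;   /* studio clock lead, in seconds */

      /* Coding + transmission delay per chain: invented example numbers. */
      struct { const char *chain; double delay; } chains[] = {
          { "analogue terrestrial", 0.05 },
          { "DAB",                  2.50 },
          { "DRM",                  4.00 },
          { "digital satellite",    6.00 },
      };

      /* Each chain buffers the signal for T minus its own delay. */
      for (int i = 0; i < 4; i++)
          printf("%-22s buffer = %5.2f s\n",
                 chains[i].chain, T - chains[i].delay);

      return 0;
  }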

There are a couple of problems, though, with delayed "live" transmission:

  - One is with the BBC, who insist for nostalgic reasons on transmitting
    the Big Ben chimes live; these cannot be run 10 seconds early in
    sync with the studio clock.

  - Another is live telephone conversations with untrained members of the
    radio audience who run a loud receiver next to the phone. The delay
    eliminates the risk of a feedback whistle, but it now adds echo and
    human confusion. The former can be tackled with DSP techniques; the
    latter is more tricky.

  - The third problem is that in the present generation of digital
    radio receivers (DAB, DRM, WorldSpace, etc.), the authors of the
    spec neglected to standardize the exact buffer delay in the receiver.

Mostly for the last reason, the time beeps from digital receivers still
have to be used with great caution today (or are even left out by some
stations, which prefer to send none rather than wrong ones).

> > Either having a commonly used standard time without leap seconds (TI),
> > or having TAI widely supported in clocks and APIs would have solved the
> > problem.
>
> Absolutely - and the second suggested solution doesn't need to take 20
> years to be implemented.

The engineer involved in this project to whom I talked was actually very
familiar with my API proposal on

  http://www.cl.cam.ac.uk/~mgk25/time/c/

and agreed that the problem would never have come up if that API had been
widely supported by Linux, NTP drivers, and GPS receiver manufacturers.
But we are not there yet.
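
To make the point concrete, here is a minimal sketch of the kind of code
the transmitter side would like to write. It is not the API from the page
above, but the same idea expressed through the POSIX clock_gettime()
interface, assuming the system offered a TAI-based clock id (the CLOCK_TAI
name below is such an assumption, not something you can count on being
available). With only a UTC-based clock, the scheduled hand-off time
becomes ambiguous around a leap second:

  #define _POSIX_C_SOURCE 199309L
  #include <stdio.h>
  #include <time.h>

  int main(void)
  {
      struct timespec now, due;

      /* Assumes a TAI clock id; fails at run time where none is offered. */
      if (clock_gettime(CLOCK_TAI, &now) != 0) {
          perror("no TAI clock available");
          return 1;
      }

      /* Hand the packet to the D/A converter exactly 10 SI seconds from
         now.  TAI seconds are never inserted or deleted, so this interval
         stays well defined even across a leap second. */
      due = now;
      due.tv_sec += 10;

      printf("now: %ld.%09ld TAI   due: %ld.%09ld TAI\n",
             (long)now.tv_sec, now.tv_nsec,
             (long)due.tv_sec, due.tv_nsec);
      return 0;
  }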

The current discussion on removing leap seconds will no doubt also delay
efforts to make TAI more widely available: what is the point in improving
the implementations if the spec might soon change fundamentally?

I don't care much whether we move from UTC to TI, because both approaches
have comparable advantages and drawbacks, which we understand today
probably as well as we ever will. But it would be good to make a decision
sooner rather than later, because the uncertainty that the discussion
generates about how new systems developed today should handle leap
seconds can be far more of a hassle. It would be unfortunate if at the
end of this discussion we change nothing, and all we have accomplished is
to delay setting up mechanisms to deal with leap seconds properly. I
personally certainly do not feel motivated to press ahead with proposals
for handling leap seconds better if there is a real chance that there
might soon be no more of them.

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__
