From fuzzy to sharp – the dinosaur in the room

Figure 1 – Classic rabbit ear television antenna. By Bernd from the Wikimedia Commons and in the public domain.

The evolution of broadcast television and its decline are closely related to the evolution and ascent of digital photography. If you go back to the fifties and consider the images from those early televisions, they were highly inferior to contemporary silver gelatin photography. This has much to do with how they were made. Video cameras worked essentially on an image dissector principle. Light was focused with a camera lens on a photocathode, which emitted a beam of electrons from its back side. These were tamed into reasonable focus with a magnetic field. A small portion of this electron beam was allowed to pass through an aperture and onto a detector, and the entire image was scanned sequentially, either electronically or mechanically. The process was reversed on the television monitor: a beam of electrons was focused and scanned across a phosphor screen.

In the first demonstration of television, by John Logie Baird in 1926, the scanning was accomplished using a Nipkow disk, in which the optical field is scanned by a spiral of apertures. Aficionados will recognize the important role Nipkow disks play today in confocal microscopy.

This whole process didn’t lend itself to either high resolution or high dynamic range images. However, it did lend itself very well to analogue image transmission, first across wires and then wirelessly. Hence the images were fuzzy, and we preferred silver gelatin photographs.

So how did technology evolve to where it is today? First, the whole scanning process improved, which explains why television in the United States was inferior to television in Europe: the US standardized early, before higher-definition scanning was technically possible.

And then something transitional happened. In 1969 the CCD was invented by Willard Boyle and George Smith at Bell Labs. Scanning was no longer necessary, and resolution on the camera side was defined by how small the pixels were and how many you could pack into the sensor. And of course, when the computer world moved away from cathode ray (scanning) tubes to solid-state pixel arrays of light emitting diodes (LEDs), the die was cast for high resolution digital television and digital photography.

I believe that the CCD and the LED array were the key new species in the technological forest. Home computers, video games, and Facebook all owe their wide acceptance and technical dominance to these two inventions. These in turn are rapidly consuming the broadcast television base. Indeed, it seems likely that the only factor slowing down this process of technological evolution is the length of a human lifetime. The older you get, generally the slower you are to adapt. We, not the rabbit-eared television, are the true, recalcitrant dinosaurs.

 
