June may seem a little early to be thinking of Christmas but there’s something going on in the stores now that merits our attention. While Apple’s iPad dominates the tech headlines and 3-D pops off the movie screens, the slow-motion tumble of high-definition television into our living rooms appears to be coming to an end – with a variety of affordable TV sets, DVD players and internet connections that are, indeed, very good.
A Long and Winding Road
The road to high definition television – a picture that not only displays in a widescreen cinematic aspect ratio but is twice as sharp as the old standard – has been long and tortuous. A consortium of engineers from all the major broadcasters, Hollywood studios, consumer electronics manufacturers, and, not to be dismissed, computer companies came together in 1991 to establish a standard for HDTV broadcasting and began field testing the new signal in 1994.
In 1996, WRAL-TV in Raleigh, North Carolina, won the race to become the first local station to broadcast in high definition. In 1998, HDTV got its public launch (to selected science centers and museums) with coast-to-coast coverage of John Glenn’s return to space on the Space Shuttle Discovery.
It was not until the early 2000s, however, that TV stations settled on a uniform transmission standard. (Just in time for Super Bowl XXXIV on January 30, 2000.) Cable networks and satellite television companies started adopting the new signal in 2002, but Europe did not join the race until a Belgian TV station began transmitting in HD in 2004.
What Kind of HD?
The ten years the industry spent fleshing out the details of HDTV did not yield any easy answers. Camera manufacturers began the decade in an analog world debating how many lines should make up the new TV picture. (480, 720 or 1080? Interlaced or progressively scanned?) They ended it in a digital era that defines a good picture by how many bits per second a camera can pour into a data stream.
The high definition signal, when you looked at it straight out of the camera, knocked your eyeballs out. But getting it into our living rooms, the engineers concluded, would require new frequencies, new broadcast towers, new TV sets, and a whole new way of making TV shows.
Editors struggled to come up with compression algorithms that could crunch the massive new video files into data their desktop computers could handle. Broadcast engineers, meanwhile, sliced and diced the new bandwidth assigned to their station, weighing the advantage of better picture quality against the potential profits of splitting it into multiple channels, waiting for their bosses, the money men, to tell them which way to go.
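The scale of that compression problem is easy to see with a little arithmetic. The sketch below is a rough, back-of-the-envelope illustration – the frame rates and the 16-bits-per-pixel sampling are my assumptions for the sake of the example, not figures from this article – of what uncompressed video at the debated picture sizes would demand:

```python
# Back-of-the-envelope uncompressed data rates for the picture formats
# the industry debated. Assumes 16 bits per pixel on average (roughly
# 8-bit 4:2:2 color sampling); frame rates are illustrative.

def uncompressed_mbps(width, height, frames_per_sec, bits_per_pixel=16):
    """Uncompressed data rate in megabits per second (10^6 bits)."""
    return width * height * frames_per_sec * bits_per_pixel / 1e6

formats = {
    "480i (old standard)": (720, 480, 30),    # interlaced: 30 full frames/s
    "720p":                (1280, 720, 60),   # progressive scan
    "1080i":               (1920, 1080, 30),  # interlaced
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: about {uncompressed_mbps(w, h, fps):,.0f} Mbps")
```

Even the smallest of those streams dwarfs the roughly 19.4 megabits per second a station’s digital broadcast channel can carry – which is why compression ratios on the order of 50:1 were not a nicety but a necessity.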
To implement high definition television in the United States, the Federal Communications Commission effectively re-aligned much of the spectrum of the public airwaves. Broadcast television signals were moved to a higher frequency, and the old signal paths were re-assigned for public emergencies or auctioned off for future use by mobile phones and mobile TV (watch for something called Flo TV from Qualcomm).
The Perils of Being an Early Adopter
Early adopters, sometimes called “lighthouse customers”, paid a price for being the first on the block with an HD television set. Viewers of that first Super Bowl in HD watched it on TV sets costing $3,000 or more. Subscribers to Comcast’s first cable HD service in 2003 could watch only a handful of programs produced at that quality. The broadcasters were in no rush to build expensive new HD transmitters, and, even as prices fell on the widescreen units, most consumers were inclined to believe the old TV was plenty good enough. (This wasn’t, after all, like going from black & white to color.)
Originally, the FCC wanted to have the switch from analog to digital TV occur on December 31, 2006. But Congress twice delayed the deadline and it did not take place until June 11, 2009.
King Richard’s Nail
For want of a nail, a shoe was lost; for want of a shoe, a horse was lost; for want of a horse, a battle was lost. And that, the rhyme goes, is how King Richard lost his kingdom.
In the transition to high definition television, the equivalent of a nail was the cable connecting a satellite or cable box to the TV set. It wasn’t until 2006 that a Silicon Valley company came up with a cable to iron out this last kink in the high definition chain. It is called an HDMI (high definition multimedia interface) connector and it replaces the old spaghetti bowl of analog cables behind your TV set with a single wire that passes audio, video and computer data between devices as an uncompressed digital signal. In only a few years, it has become the connector of choice on all digital TV devices and many computers.
This week marks the first anniversary of the switchover of American TV from analog to digital. (Next year, Canada and Japan will make the switch; Britain will change over in 2012, and China in 2015.) There were no major disruptions in service. Thanks to last-minute government subsidies for DTV conversion boxes, only 2 million homes lost their TV access when the analog broadcast towers went dark. A Home Media Magazine survey conducted last November showed 33 to 50 percent of Americans now have at least one HDTV in their homes, and 66 percent say they subscribe to HD television services.
All the major networks and cable channels are producing new programs to HD specifications, and stores are selling HDTV sets for roughly the same price as the old TVs ($199 for a 19” model).
The last gray area in the consumer products arena has been DVDs, but that too is changing fast, propelled by Hollywood’s dismay over losing DVD sales to alternative Internet delivery systems. (In 2009, DVD sales dropped 13 percent from the previous year, from $10 billion to $8.73 billion.)
In a best of all worlds scenario, high definition DVD players would have been on the same upgrade curve as other HDTV components. But six years ago, Hollywood studios became embroiled in a kind of Betamax vs. VHS format war between Toshiba’s HD DVD format and Sony’s Blu-Ray disc. One of the sticking points was how each format handled piracy issues through what is called Digital Rights Management (DRM). Few studios would fully commit to either until that was resolved.
In February 2008, Toshiba folded its tent on HD DVD. Sony’s Blu-Ray format became the de facto standard. With its spectacular picture quality, 25 gigabytes of storage, and backwards-compatible ability to play old DVDs (lacking in early versions), it is a worthy successor. Hollywood studios can pay extra to license Sony’s DRM copy protection, but individuals and small production companies can burn discs on their own at no extra cost.
For video gamers and people who fancy their living rooms as home entertainment centers, Blu-Ray opens up a whole new audio world of theater-quality surround sound. For the “late adopters”––that 50 percent of America who have held off buying into high definition TV––the cost of an upgrade is now manageable.
The price of Blu-Ray DVD players has dropped sharply in the stores, from an initial $400 to $150 or less. Even in a recession, even with an incremental price difference, Blu-Ray DVD versions of movies are selling at about the same rate as DVDs did when they were introduced as a replacement for VHS. (In April, the Blu-Ray DVD version of Avatar sold 1.2 million copies on its first day of release.) And computer manufacturers are starting to include Blu-Ray drives as part of the upgrade to Windows 7 and Intel-based Apple systems.
For the first time in 15 years, all the kinks in getting the high definition signal from the camera to your television set have been worked out. And for the first time, you can see the difference at no extra cost.
It’s going to be a very good Christmas for Blu-Ray manufacturers this year. Now the engineers can spend the next decade finding a way to universally deliver the same high definition signal to everyone, everywhere, on any device over the Internet.