New Standards for Standards
Nicholas Negroponte
Wired 5.11

I used to think that anybody who worried about standards was boring (perhaps because they were). Now I seem to be one of them.

One of the biggest problems any traveler has with laptop computing, especially in Europe, is the plugs. Europeans have more than 20 different formats, the only semblance of a standard being the one used to power an electric razor. There is actually a committee addressing the so-called Europlug - some estimates run upward of a quarter century before such a standard can be implemented, if one is agreed on at all, and then only at huge cost.

Atoms require enormous effort. Agreeing on physical form is very hard. This is not limited to the manufacturing specifications for metal and plastic machinery. The first two months of the 1968 Vietnam peace negotiations in Paris were devoted to determining the shape of the table.

Television finally is about bits
For a long time, innovation in television was more like the design of tables than the design of bitstreams. In 1991, for example, Jae Lim, a senior professor at MIT, announced to the world that "we finally agree on one thing: pictures of the future will have an aspect ratio of 16:9." Mind you, this silly thought was progressive thinking at the time. Most people wanted to commit their great-grandchildren to a specific number of scan lines and a fixed frame rate as well.

Part of the reason this parochial view existed was that almost nobody under 40 seemed to care, while the average age of television engineers was over 50. Zenith, which had both TV and computer divisions, was one of the few companies in a position to make a difference, until a new CEO sold half the company - and, sadly, sold the wrong one.

Generally speaking, it is fair to say that advanced television backed into being digital (sorry) for reasons of digital data compression - to more efficiently use expensive satellite transponders - and digital error correction - to more effectively use a decaying cable plant. While those are not necessarily wrong reasons, they are not the right ones, either. The right ones have to do with all the assets of new content that come from the digital world, not the least of these being a new facility for standards. Bits are easier than plastic.

Modems do it right
Plugs don't handshake. Modems do. This process is not too different from dogs sniffing each other. Modems try their best to communicate at the fastest possible speed, using whatever common error correction they share. Today, some of this is controlled in software; tomorrow all of it can and will be.

The reason this works is simple: people have agreed on headers and metadescriptions. In other words, the standard is about how you will describe yourself, not what you are. This is important not only for massive globalization, but also for upgrading and future change.
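The idea can be sketched in a few lines of Python. Everything here is hypothetical - the capability lists, the field names, the speeds - but it shows the principle: each side sends a header describing itself, and the two settle on the best options they share.

```python
# Hypothetical sketch of a modem-style handshake: each side advertises a
# self-description, and both pick the best options they have in common.

def negotiate(ours: dict, theirs: dict) -> dict:
    """Choose the fastest common speed and a shared error-correction scheme."""
    common_speeds = set(ours["speeds"]) & set(theirs["speeds"])
    common_ec = [ec for ec in ours["error_correction"]
                 if ec in theirs["error_correction"]]
    return {
        "speed": max(common_speeds),  # fastest rate both support
        "error_correction": common_ec[0] if common_ec else None,
    }

# Each modem's self-description. The standard covers how a device
# describes itself, not what the device is.
a = {"speeds": [9600, 14400, 28800], "error_correction": ["V.42", "MNP4"]}
b = {"speeds": [9600, 14400, 33600], "error_correction": ["MNP4"]}

print(negotiate(a, b))  # -> {'speed': 14400, 'error_correction': 'MNP4'}
```

Because only the description is standardized, either side can later support a faster speed or a new error-correction scheme without breaking the other - the negotiation simply finds the new best overlap.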

TV of the future - at least in the US, thanks to the Federal Communications Commission - will be flexible. In spite of the broadcast industry, the FCC refused to set anything but transmission standards. The result will be a slow blend of the Web (as kids know it) and TV (as baby boomers knew it). How a signal arrives, by land or by air; where it comes from, near or far; and what it looks like, a postage stamp or HDTV - all will be described in the signal, not decided by folks in Geneva or Washington, DC.

Higher standards
What the standards bodies need to do is turn their attention to some of the larger issues: while God may be in the details, a great deal needs to be said about the broad brush. The reason to make global standards is global communications. This means people communicating with people. And people have the biggest standards problem of all - they often don't speak the same language.

If a Martian were to turn an ear toward our planet, conversations around the world would sound like modems unable to communicate with each other. In the face of today's digital globalization, it would be hard to explain the thousand-plus written languages and the scores of spoken dialects.

On the other hand, people constantly question the digital dominance of English. Yet, as I like to remind them, we are glad that a French pilot lands an Airbus at Charles de Gaulle airport speaking English to the tower, as it means that other planes in the vicinity can understand. English as a second language, with or without computers, has become an international protocol of sorts and an accepted means of traffic control - even ship to shore.

In the same way, English will continue to be the air traffic control language of the Net 10 years from now. But it will stop being the dominant carrier of content - English will be replaced by Chinese. Still, all sorts of other languages will flourish as well. I remember once defending small cultures and native tongues in these pages (see "Pluralistic, Not Imperialistic," Wired 4.03, page 216), only to be told by a reader that I got it all wrong. The issue, he said, was not English versus language X, but English versus ASCII. Boy, was he right.

The ASCII standard is a huge problem, not least because it has too few bits for kanji characters or calligraphic fonts. In fact, without taking much note of this limitation, we have cemented ASCII into place far more deeply than English. We had better learn a lesson - and quickly. That lesson, however, is not to invent another Esperanto, but to realize that our bitstreams will be in different languages, which need some standard headers.
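The limitation is easy to demonstrate in Python (the header at the end is invented for illustration): ASCII's seven bits offer only 128 codes, far too few for a kanji character, while a multi-byte encoding holds it easily - and a small header can tell the receiver which language and encoding follow.

```python
# ASCII is a 7-bit code: 128 values, enough for English but not for kanji.
kanji = "漢"  # a single kanji character

try:
    kanji.encode("ascii")
except UnicodeEncodeError:
    print("ASCII cannot represent", kanji)

# A multi-byte encoding such as UTF-8 can - it spends three bytes here.
utf8_bytes = kanji.encode("utf-8")
print(len(utf8_bytes))  # 3

# A self-describing bitstream: an invented header names the language and
# encoding, so any receiver knows how to interpret the payload.
message = {"lang": "ja", "encoding": "utf-8", "payload": utf8_bytes}
```

The point is not the particular encoding but the header: once the bitstream says what it is, English, Japanese, and the thousand other written languages can all travel the same wires.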

Making the Net multilingual-ready is even more important than setting the metastandards for our modems and TVs. International bodies must recognize that a higher level of communications standard is needed to make sure that all languages are equally accommodated and self-descriptive. The 5 billion people not using the Net today have a lot to say. Kids know that.