By ROD HOLT
The business of foretelling revolutions has largely been taken away from the oracles, prophets, and madmen that Western civilization depended on in the past. Even when capitalism had barely started, prognostication was already too profitable an activity to be left to amateurs.
This became abundantly clear when some entrepreneur discovered how easy it was to steal enormous sums of money with a little gamble on the future price of Dutch tulip bulbs (1633-37), or on future profits from trade routes to the Indies (in England, 1719, in France, 1720), or on an investment in a stock company to build a canal (the French Panama Canal Co., 1889), or on any dazzling scheme purporting to let you in on the ground floor.
In the 20th century, the public saw that a fortune could be made by holding stock backed only by a patent, an idea. Intellectual property, sanctioned by law1, could be as valuable as an oil well or a factory.
Imagine the commercial value of a patent covering Louis Pasteur’s work. Would one franc for every bottle of wine be a reasonable price to pay to ensure that wine wouldn’t spoil in the bottle? How much would that amount to today? Fortunately for wine drinkers, this honorable 19th-century scientist rarely thought of patenting his work.
People saw the fortunes gathered by the legal monopolies armed with the patents of Alexander Graham Bell, Edison, Goodrich, Fleischman, Eastman, etc., and concluded that, since an invention was as good as a gold mine, speculation was justified. Once people came to believe that an idea could be converted to vast wealth, the fraudulent “scientific genius” was freed to run amok. The seer was now selling stock with intoxicating visions of the future.
After World War II, a series of inventions and “high tech” products captivated the stock market. Touted as revolutionary when they appeared, most faded quietly. Consider one disaster that haunts us still: nuclear power. Here was an almost proven technology promising the investor the profits of an unlimited, cheap energy source. But eventually the investors lost. Although resuscitated time and again with taxpayers’ money, the project which had started with a bang went out with a wheeze.
But there were just enough screaming successes to keep investors alert with checkbook at hand. In the semiconductor bubble it was Fairchild, Texas Instruments, and National Semiconductor. In the small computer bubble, Digital Equipment Corp. (DEC) and Wang were the big names.
In tape recorders and VCRs, it was Memorex, Sperry, Ampex, Sony. In the microprocessor and DRAM run-up, it was MOS Technology, RCA, Mostek, Intel, AMD. In the personal computer adventure, it was Apple, Commodore, Northstar, Altair-MITS, IBM, Compaq and more.
Half of these names have disappeared, yet since the winners returned thousands or tens of thousands of dollars for each share of founder’s stock, the investor cannot help but become a gambler and buy into the next hot stock, the next revolutionary idea.
Today the process continues. New companies with new ideas peddle stock at incredible prices to the public while those investors in the know pick the winners. “Bio-tech” companies, last year’s darlings, have been pushed aside by the more strident dot-com companies. Clever, memorable dot-com addresses are now as essential on the letterhead as the old-fashioned telephone number once was.
In a frantic effort to get your attention, the new companies are supporting a boom in the advertising industry by spending a staggering amount on radio, TV, billboards, newspapers, junk mail, and all the rest.
Every newspaper’s business section reports mergers and buy-outs of companies with names you’ve never heard of. The Internet was the pivot point for the biggest merger in American history, the Time Warner and AOL deal,2 which was worth $350 billion.3 Over one thousand new companies, these dot-coms and their neighbors, were founded in 1999.
Gradually the Internet has become a respectable place to do business. In January, 2000, the World Bank offered $3 billion in bonds and sold $1 billion of that on the Internet.
The lure of real wealth to be made overnight by buying or selling the right stock at the right time has become irresistible. Now anyone with credit can buy stocks on the Internet from Goldman Sachs, Lehman Brothers, Charles Schwab, or eBay.com. Using the Internet, you can buy and sell stocks of companies that are Internet companies. The snake is swallowing its tail.
Why the pandemonium? What is the Internet?
There are hundreds if not thousands of answers given to these questions. Most of the computer “How To” books have at least some history with a bit of froth on top.4 But Berners-Lee’s Weaving the Web (Harper San Francisco, 1999) is by far the most calm, insightful, and accurate of them all.
Tim Berners-Lee invented the World Wide Web. He gave it its name and abbreviation. He wrote the first functioning code for the Web and made it available to the public free.
Although CERN, supported by its European member states, paid all the expenses for Berners-Lee and his group, there was no national chauvinism restraining him. His colleagues in the United States donated as much time and money as any in the world. He gathered together the work of many individuals and laboratories to give us hypertext, the HTML language, HTTP, color graphics, and movies, and built an organization to nurture the Web.
It is fascinating to read of his collaboration with so many Internet enthusiasts. He tips his cap to Steve Jobs (of Apple Computer fame) and the NeXT computer he used.
Berners-Lee did not invent the Internet, though ninety-nine percent of people think that everything on the Internet has a “WWW” and a “dot” in its name. As Berners-Lee relates his story, it becomes clear what the World Wide Web really is and how and why it is distinct from the Internet. His technical discussions on this and other topics are distributed throughout the narrative in a very natural way. To help the reader, he provides a glossary of technical terms and a good index.
Because Tim Berners-Lee is writing a personal history, we can excuse him for saying little about the first 20 years of the Internet’s life. For the sake of continuity, we will fill in a few blanks.
Starting back in the 1960s when computers were very, very expensive, the Internet grew as a system of interconnections permitting scientists to share computer time. Various schemes were invented for this purpose and the network grew like Topsy.
The U. S. government took notice of this activity and, being reluctant to spend money without some sort of reason, asked the Department of Defense to supervise the network on the grounds that it could be a functional communication system even during an atomic war. They thought the Internet could survive because its complexity provided a jillion routes between any two terminals and at least one route would be open no matter how widespread the devastation.
There were many different computers in use, so an early problem for Internet users was to reach agreement on the ground rules for communication. Thus standard protocols, notably TCP/IP (Transmission Control Protocol/Internet Protocol), were adopted and spread throughout the world.
Many programs have been written that do the work of allowing an individual’s computer to talk freely with other computers using the Internet’s protocols, and almost all have been made available free of charge. The popular e-mail program Eudora, written by Steve Dorner, is free. Free, too, is Gopher (developed at the University of Minnesota). The famously successful Netscape Navigator was written by a group around Marc Andreessen, and it was free up to version 4.
There was a reason for making these and all others free. The Internet was only as useful as its available resources. To expand the resources, Internet supporters appealed to publicly and privately supported schools, universities, and libraries to build databases and put them on the Internet free of charge. This was the only way, because many working groups were barred from using public funds for commercial purposes.
Berners-Lee picks up his story in the late 1980s, after the Internet was 20 years old, and proceeds chronologically up to the publication of his book in 1999. He joined CERN in 1984 as a computer scientist. CERN has long been one of the world’s best high-energy physics laboratories; it is supported by its European member states.
A very well-educated Englishman from a family of mathematicians, he had developed a passion for connecting and interconnecting every person with every other and with every library and every database. He says, “The philosophy was: What matters is in the connections.”5
His own statement aside for the moment, he believes that every human anywhere in the world should be able to reach out and obtain every fact, piece of evidence, item of knowledge, and opinion, whether true or false. Further:
“I have a dream for the Web and it has two parts.
In the first part, the Web becomes a much more powerful means for collaboration between people. In the second part of the dream, collaborations extend to computers. Machines become capable of analyzing all the data on the Web: the content, links, and transactions between people and computers.”6
Over half the book is dedicated to discussing these goals. Berners-Lee knows he needs a committee of international stature that publishes standards, i.e., documents which tell programmers what they must do to be sure everything works together. These standards also prevent people from writing special versions of their programs that work only with their own computers or their own software.
Furthermore, this committee must have the resources to solve unforeseen problems and to improve the capabilities of the system. This is a daunting task but if it is left undone, the World Wide Web will become a nightmare rather than a blessing. This task is basically a political one. How are the conflicting interests of the individual, the state, the commercial enterprise, and the society as a whole to be resolved?
Some of us may smile at the good-willed naiveté of Berners-Lee, but he succeeded. The World Wide Web Consortium was convened in 1994 at MIT’s Laboratory for Computer Science. Most computer manufacturers, universities, and software manufacturers joined the Consortium. Its headquarters was at MIT, and Berners-Lee moved from France to Cambridge, Mass., to act as its head.
The reason the Consortium was possible at that time, and still lives, is the peculiar stage of development of the industry it supports.
The hot-house companies of the dot-com world have speculative value today only if they can permeate society tomorrow, and to a far greater extent than did the telephone, the radio, TV, and the doorbell taken all together. The dot-com gamblers can expect to do this only by smiling at us, keeping their derringers in their boot tops, and collaborating with each other to expand the Web.
Growth is very important to the speculator. Everyone must be brought into the fold, for in the future they expect the Web to educate us, to entertain us, to provide us with our only channel to purchase commodities, to be our bank and post office, and to give us point-and-click companionship when we get old.
This kind of society makes us just as nervous as it does Berners-Lee. He looks at the Consortium and sees no representative for the individual and no one seems to be asking what kind of society is being represented.
In the chapter, “Competition and Consensus,” Berners-Lee continues to describe the dance back and forth between various elements of the Consortium. Despite the tensions inside the Consortium, Berners-Lee wants to show how technical expertise can head off trouble at the pass.
The bipartisan “Communications Decency Act,” supported by the Christian Right, came up in Congress in 1995. This proposed set of regulations would allow censoring the Web and the Internet and granted police power to everyone from a bureaucrat to a child’s parent.
The Consortium dodged that bullet with a clever labeling scheme (the Platform for Internet Content Selection, or PICS): a tag attached to a package of Web material, carrying, for example, a “morality rating number,” would let the user’s own software filter the material out and spare Grandma embarrassment. This way, the user was responsible for the censorship, not the Web people nor the service providers. The Communications Decency Act passed Congress but died in the courts.
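The filtering idea can be sketched in a few lines of Python. The label format, ratings, and addresses here are hypothetical, not the actual PICS syntax; the point is only the division of labor the Consortium proposed, where the publisher labels and the user’s own software, not the network, decides what to discard.

```python
# Hypothetical content labels: each page carries a rating set by its
# publisher; the user's software compares it to a locally chosen limit.
pages = [
    {"url": "news.example", "rating": 0, "body": "Election results..."},
    {"url": "racy.example", "rating": 4, "body": "Not for Grandma..."},
]

def visible(pages, max_rating):
    """Keep only pages whose rating label is at or below the user's limit."""
    return [p for p in pages if p["rating"] <= max_rating]

# Grandma's browser is set to a strict limit; nothing upstream is censored,
# and a different user with a different limit sees everything.
for page in visible(pages, max_rating=2):
    print(page["url"])
```

The censorship decision thus lives entirely on the user’s machine, which is what let the Consortium argue that neither the Web’s architects nor the service providers needed to police content.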
Berners-Lee finishes the book with a series of chapters wrestling with social problems and morals. There are many of them.
Suppose someone posts a slanderous story about you on a public bulletin board. Hundreds of thousands will have seen it before you have even heard of it. What do you do? Should anonymity be allowed on the Web? On the other hand, can one have privacy without anonymity?
The problems are much more interesting and intricate than they sound here. His chapter on privacy alone is worth the price of the book.
Berners-Lee shows so very clearly the limits of the non-Marxist viewpoint. Without any generalized critique of the social forces at work, he just doesn’t see the basis of the conflicts that come into focus around the Web Consortium, the Web and its commercialization. His puzzlement is a bit sad.
1 The Constitution of the United States, Article I, part 8.
2 Time, Jan. 24, 2000. pg. 38 and following:
AOL was America Online. It provided millions of users with a very friendly imitation of the Internet before the Web was readily available. Once the WWW was established in the U. S. and Europe, AOL provided its customers full access to the entire Internet. In 1999, AOL bought Netscape (of Navigator fame) so it could offer its users a full-fledged browser.
3 Keep in mind that for a mere $50 billion you could buy every single outstanding share of General Motors!
4 Adam C. Engst includes much valuable history in the 3rd edition, Internet Starter Kit for the Macintosh, Hayden Books, 1995.
5 Weaving the Web, pg. 13
6 Weaving the Web, pg. 159