By Steven Shapin
Oct. 3, 2014 5:37 p.m. ET
Walter Isaacson’s last book was the best-selling biography of Steve Jobs—the charismatic business genius of Apple Computer and one of the beatified icons of modern technology and entrepreneurship. Mr. Isaacson’s fine new book, “The Innovators,” is a serial biography of the large number of ingenious scientists and engineers who, you might say, led up to Jobs and his Apple co-founder Steve Wozniak—“forerunners” who, over the past century or so, produced the transistor, the microchip and microprocessor, the programmable computer and its software, the personal computer, and the graphic interface. These in turn were among the technological conditions for the videogame, the Internet and Web, the search engine, the online crowd-sourced encyclopedia and the ability to use a touchscreen to hurl spherical birds into buildings and make them explode.
Taken together, this is what is called the Digital Revolution, built on an originating vision of machines that could calculate but expanding to envisage digitized mechanisms handling anything that could be symbolically represented—logical operations, words, images, signals and sounds. The digital vision looked into the future and saw unimagined possibilities of representing, recombining and communicating information—and doing so fast, robustly, cheaply and globally. The Scientific Revolution of the 17th century had changed the way that a small group of elite thinkers saw the world, but most of the world’s population still doesn’t think the way scientists do. The Digital Revolution, by contrast, has changed many things for all of us. There’s no overestimating the shattering effects of these technologies on our economy, our culture, our forms of interaction and our sense of who we are.
Some people call this the Third Industrial Revolution—the first based on coal, steam and iron; the second on steel, electricity and mass production; and the last on electronic computers and information technology. Each revolution has its heroes, but Mr. Isaacson here reckons that the biographical genre exaggerates the contributions of individuals and vastly underestimates incremental improvements over time and the creative interactions that individuals have with one another. So Mr. Isaacson’s task in “The Innovators” is different from the one in “Steve Jobs”: It is to tell the story of how the Digital Revolution happened; to tell it through the accomplishments of many individuals, some of whom were immensely clever and visionary; and, at the same time, to throw cold water on heroic biography, the attempt to identify the lone geniuses who saw the future, whole and complete, just waiting to be brought into being.
To his credit, Mr. Isaacson periodically suggests that we shouldn’t be thinking of lone geniuses, or even of self-sufficient imaginative individuals, at all but of the organizational forms within which innovation takes place. The book is historically organized by chapters on specific digital technologies—“The Computer,” “Programming,” “The Microchip,” “Video Games,” “The Internet,” etc.—and, within each chapter, by accounts of the individuals who made significant contributions. But it’s most effective when it gets to grips with creative teams—groups whose ideas arose from exchanges among their members and whose inventiveness flowed from their differences in knowledge, skills, styles of working and temperament.
So Mr. Isaacson draws attention to organizations that, for a time, hosted groups that were more than the sum of their individual parts. At the great “idea factory” that was AT&T’s Bell Labs during and after World War II, hundreds of scientists and engineers, representing a wide range of disciplines, were given a relatively free hand to explore and improvise. The physicists John Bardeen and Walter Brattain, aided (and at times annoyed) by their abrasive colleague William Shockley, formed a team that met practically every day, “a quintessential display of finish-each-other’s-sentence creativity.” They created the transistor, the fundamental building block for the microprocessor and what has been called the most important invention of the 20th century.
The pattern is recurrent: Creative, whole-more-than-sum-of-parts teams were crucial to digital innovation at Intel, the key company in the development of the microprocessor industry, and at Xerox PARC, probably the single most fertile source of electronic innovations in the 1970s: Ethernet, the graphical user interface and the mouse. Steve Jobs told Mr. Isaacson that his “role model” was Robert Oppenheimer, who at wartime Los Alamos so effectively found ways of getting scientists with radically different skills and personalities to work together in designing the atomic bomb.
Teams innovate, but Mr. Isaacson also attributes a powerful role to what he calls “ecosystems.” The Silicon Valley ecosystem included venture capital, without which no digital innovation would ever have become a commercial reality. It also included the universities at Berkeley and, especially, Stanford, which licensed the intellectual property produced by its professors and students, shared its faculty and facilities with the new industries, and fed the pipelines from which emerged a constant flow of computer scientists and engineers. Most especially, the Silicon Valley ecosystem included the 800-pound gorilla that often goes unmentioned by libertarian, I-did-it-all-myself digital entrepreneurs—the Cold War military establishment, which provided customers and funding for the early computer and microprocessor industries and later sponsored the development of the Internet.
At the opposite political pole from the Pentagon, the Bay Area ecosystem spurring digital innovation contained Homebrew hackers, hippies, hobbyists, Whole Earthers, Free Speech activists and radical community organizers who saw the personal computer and the emerging Internet as potential sources of Power to the People. Stoned in Golden Gate Park, listening to the Grateful Dead, some counter-culturists were inspired to turn north to Marin County and form mantra-chanting, organic-farming communes; others turned south to Palo Alto and helped found the personal-computer industry and build the Internet. The Pentagon and the anti-Vietnam War radicals had different visions of how digital technology should develop, but they were oddly coupled together in inspiring the Digital Revolution.
That revolution has been energized by its visionaries—prophets and seers mobilizing followers to realize their vision of a world re-made by new technologies. Some of the prophecies were self-fulfilling: Gordon Moore’s “Law” predicting the doubling of a microprocessor’s power every year and a half focused energies on a goal that was authoritatively said to be attainable. Before Microsoft was founded, the Harvard undergraduate Bill Gates was considering whether to develop a programming language for Intel’s new 8008 processor. He rejected the enthusiasm of his colleague Paul Allen because the proposed version of the BASIC programming language would take up almost all of the chip’s memory. Moore’s Law assured them that a more powerful microprocessor would surely appear within a year or two, so why not put that project aside and take up other lines of work?
After Microsoft was incorporated, Mr. Gates had another compelling vision about the future of the computer industry. He foresaw that hardware would soon become commodified, generating only low profits, while software companies would eventually rule the commercial roost. The crucial thing wasn’t to maximize short-term profits but to do what you could to make your software an industry standard. That vision shaped Microsoft’s decision to buy in—at a scandalously low price—what eventually became the DOS operating system and then to peddle it to IBM on a nonexclusive license, allowing Mr. Gates’s company to license it on to manufacturers of PC clones. At Apple, Jobs too was a great prophet: His aesthetic and technological vision foresaw hardware and software bound together in a sealed, integrated system, and that vision propelled the development of the Mac and its successors. Even before the iPod and the iPhone, Apple was a profitable company, but, so far, the future seems to have voted for Microsoft’s vision.
There is an understandable temptation to write history backwards, inferring the sufficient causes of success from some characteristics shared by successful enterprises. But most visions of the technological and commercial future turn out to be wrong; most entrepreneurial businesses fail; most teams put together following any formula for innovation turn out disappointingly. And there’s plenty of vision involved in deciding what lines of development not to pursue, what product lines to abandon. It’s clearly a good thing to be a visionary, but it’s a lot better to have the future prove you right. One of the truly remarkable features of digital revolutionaries was the confidence they had in their own techno-prophecy. A popular saying at Xerox PARC was that “the best way to predict the future is to invent it.”
Mr. Isaacson’s heart is with the engineers: the wizards of coding, the artists in electrons, silicon, copper, networks and mice. But “The Innovators” also gives space to the revolutionary work done with men as well as mice: experiments in the organizational forms in which creativity might be encouraged and expressed; in the aesthetic design of personal computers, phones and graphical fonts; in predicting and creating what consumers did not yet know they wanted; and in the advertising and marketing campaigns that make them want those things. Not the least of the revolutionaries’ inventions was their own role as our culture’s charismatic prophets, uniquely positioned to pronounce on which way history was going and then to assemble the capital, the motivated workers and the cheering audiences that helped them make it go that way.
—Mr. Shapin’s books include “The Scientific Life: A Moral History of a Late Modern Vocation.”