The Innovators

By Walter Isaacson

Some of us can still remember a time when the Internet was being dismissed as a lot of hype over nothing.

Those days are gone. The Internet was actually bigger than advertised, and has had an impact that may turn out to be greater than that of the industrial revolution. It truly changed everything: culture, economics, politics, and even how we think and relate to one another.

And it all happened so fast: from the invention of the first computer to an entirely wired world in a single lifespan.

The digital revolution had many parents, including, yes, Al Gore. In The Innovators Walter Isaacson provides a full accounting of how it happened. This involves both unpacking the Internet’s genealogy and accounting for all of its component parts and industries: computers, microchips, transistors, software, personal computers, and more.

It is a giant undertaking. A great deal of complex science and technology has to be very quickly explained, and there are nearly a hundred pocket biographies of key innovators to get through along the way.

By taking a general approach, Isaacson highlights how historical forces, individual talent, and technology all had to come together to create the current digital dispensation. This means the book sometimes reads more like a reference work than a narrative history, but it nevertheless provides an accessible overview of a subject whose intricacies are beyond the ken of most of us. And it also allows Isaacson to conduct an inquiry into a matter of special interest: Where did all this innovation come from?

Isaacson presents two models for innovation: solitary creative genius and collaborative teamwork. He argues both are necessary, but in the end he is clear that collaboration is the more important: “Innovation is not a loner’s endeavor.” Scientific breakthroughs were of course necessary, but engineers were also needed to put the big ideas of the digital revolution into practice, and business savvy was required to bring the product to market.

The revolution was underwritten by necessity and demand. Whether development was being undertaken by the military, universities, or corporations (the so-called “iron triangle” of the military-industrial-academic complex), what we got was a revolution we wanted, coming in the form of ever smaller, cheaper, faster, and more attractive devices.

This is an important point. There can be different paths to the same innovation: the basic science and many of the essential developments of the digital revolution (like the computer and the microchip) were arrived at nearly simultaneously in various places. But the direction innovation was driven in was largely the result of consumer choices. As has been observed before, technology opens a door but doesn’t force us to go through. Consciously or not, we build our own future, and direct our own social evolution.

Isaacson’s interest is in the history of the digital revolution and not where it may be taking us. Nevertheless, he does try to project some of the trends he examines into the future.

What he would like to see going forward is a digital environment strengthened by human factors: “values, intentions, aesthetic judgments, emotions, personal consciousness, and a moral sense.” In other words, a wedding of C. P. Snow’s “two cultures” of science and the humanities, linking “beauty to engineering, humanity to technology, and poetry to processors.”

That is a consummation devoutly to be wished. Current trends, however, are not promising. Indeed, things seem to keep moving in another direction entirely.

The cultural economy has been severely disrupted by the digital revolution, and despite the best efforts of many concerned experts in the field (a couple of recent survival guides being Jaron Lanier’s Who Owns the Future? and Cory Doctorow’s Information Doesn’t Want to Be Free) there are few clear or convincing ideas on the table for how a creative class will be able to sustain itself in the future. Going viral is a windfall, not a career. University enrollment in the arts continues to decline, as the squeezed middle class reads the tea leaves and sees the humanities as a professional dead end.

Still, innovation can take many forms, and the “human factor” is adaptable. Our technology has been speeding ahead of us, but we may have time yet to catch up to our machines.

Review first published in the Toronto Star December 26, 2014.