The history of 'The Innovators' tells us that collaboration is core to innovation


"The Innovators" author Walter Isaacson

 Image courtesy of Simon & Schuster

The digital era and its discontents occupy much of humanity's screen time these days. Attention is a currency: attracted, bought, and spent on billions of mobile devices and computers to raise awareness of current events or to market new products and services. Over the past month of travel, however, I've enjoyed diving back into history with "The Innovators," Walter Isaacson's terrific new book about the people behind the digital revolution that has changed how we work, live, govern, and play.

I picked up a free copy of the book at a talk at Google's Washington, DC office, where Isaacson interviewed two pioneers who helped build the internet -- Vint Cerf and Bob Kahn -- about their place in history. You can watch their discussion in the video below. (I posed a question at about 70 minutes in, asking whether the US was still creating the conditions for innovation that Isaacson described. All three men answered in the affirmative.)

While Isaacson spends some time telling the relevant recent stories of Wikipedia, Google, and the web, the bulk of the book is an eminently readable survey that moves from the dawn of computing to artificial intelligence. As is the case with so many of Isaacson's books, the prose is lucid, approachable, and concise. He's an excellent synthesizer of ideas and concepts, combining firsthand quotes from the surviving protagonists -- the people who built the precursors and protocols underneath the information technology systems you're using to read this column -- with research and citations from previous works. Isaacson makes technology approachable for a layperson while including enough technical detail to interest engineers. If you or someone you love is interested in the history of IT, I highly recommend the volume as a gift.

After finishing the book, I retained several top-level insights that I think are relevant to anyone focused on building technology and teams today -- or simply seeking to understand why some efforts succeed and others do not. In general, Isaacson says that successful technology innovation requires at least three things: "a great idea, the engineering talent to execute it, and the business savvy (plus deal-making moxie) to turn it into a successful product."

Cover of "The Innovators"

 Image courtesy of Amazon.com

In no particular order, here are the conclusions Isaacson drew from a century of revolutionary and evolutionary change.

First, innovation is a collaborative process. "Innovation comes from teams more than lightbulb moments of lone geniuses," said Isaacson. The computer, the ARPANET, and the internet were all designed by collaborative teams. The latter two innovations enabled people to collaborate not only within teams but with strangers across regions, countries, and time zones.

Second, innovation builds on the ideas that came before: "Collaborations are not merely among contemporaries but also between generations," wrote Isaacson. For instance, the World Wide Web was created by combining two previous innovations: hypertext and the internet.

Third, physical proximity is beneficial, despite the immense utility of the internet as a tool for distributed collaboration.

"Like Xerox PARC and other corporate research satellites that followed, Bell Labs showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters," writes Isaacson.

Fourth, the best leadership has historically combined complementary styles: inspiring visionaries, brilliant engineers, good managers, and great collaborators. Similarly, the most productive teams are diverse ones with a wide range of specialties. The most successful endeavors Isaacson lists all combined clear vision with leadership that fostered collaboration. Notably, the successful innovators in the book were primarily "product people," which may be a necessity in creating technology that not only works but is accessible to billions of people.

Isaacson shared several examples of teams that pair visionaries with great engineers. For instance, Robert Noyce and Gordon Moore needed Andy Grove at Intel. Visionaries without great teams all too often end up as historical footnotes. (In this book, that was literally the case.) "...the transistor was one of the most important discoveries of the twentieth century," wrote Isaacson. "It came from the partnership of a theorist and an experimentalist working side by side, in a symbiotic relationship, bouncing theories and results back and forth in real time."

Fifth, Isaacson outlined three models for creating teams in the digital era: government funding and coordination (ENIAC/ARPANET), private enterprise (Bell Labs, Xerox PARC, Intel, Atari, Google, Microsoft, Apple), and peer-based commons production (Wikipedia, World Wide Web, Linux, Firefox).

"In the cases of the Internet, the Web, and some forms of software, the open model would turn out to work better," wrote Isaacson. "But when it came to hardware, such as computers and microchips, a proprietary system provided incentives for a spurt of innovation in the 1950s."

Today, "sometimes people advocate one of these modes of production over the others based on ideological sentiments," observed Isaacson. "They prefer a greater government role, or exalt private enterprise, or romanticize peer sharing."

Finally, creativity and tinkering are central to innovation. In his answer to my question and in the book, Isaacson noted that the roots of personal computing lie in the San Francisco Bay Area and several countercultural trends of the 1970s, from hippies to New Left activists to Whole Earth communalists.

"As different as some of these tribes were from each other, their world intermingled and they shared many values," he wrote. "They aspired to a do-it-yourself creativity that was nurtured by building Heathkit radios as kids, reading the Whole Earth Catalog in college, and fantasizing about someday joining a commune. Ingrained in them was the very American belief, so misunderstood by Toqueville, that rugged individualism and the desire to form associations were totally compatible, even complementary, especially when it involved creating things collaboratively. The maker culture in America, ever since the days of community barn raisers and quilting beers, often involved do-it-ourselves rather than do-it-yourself."

As Isaacson explored in his well-regarded biography of Steve Jobs and in this tome, this was the historical and cultural context for Steve Wozniak's creation of the Apple I and the Homebrew Computer Club. "The Innovators" retraces well-worn ground, recounting the three major innovations at Xerox PARC that Steve Jobs saw on his first visit there: Ethernet, object-oriented programming, and a graphical user interface (GUI), made possible by bitmapping pixels onto the screen, that used a desktop metaphor and a mouse-controlled cursor. The latter is what inspired him, leaving Jobs thinking that he "could see what the future of computing was destined to be."

Over the years, Jobs has been criticized for stealing Xerox's ideas. Isaacson argues that such criticism doesn't properly credit the genuine innovation that followed at Apple.

"What really matters is execution," he wrote. "Jobs and his team took Xerox's ideas, improved them, implemented them, and marketed them. Xerox had the chance to do that, and they in fact tried to, with a machine called the Xerox Star. It was clunky and kludgy and costly, and it flopped. The Apple team simplified the mouse so that it has only one button, gave it the power to move documents and other items around the screen, allowed file extensions to be changed just by dragging a document and 'dropping' it into a folder, created pull-down menus, and allowed the illusion of documents piling on top of each other and overlapping."

If that sounds familiar, it's because the next stage in the personal computing revolution is well known to hundreds of millions of people around the world: Bill Gates and Microsoft created their own operating system with a graphical user interface, called it Windows, and licensed it to IBM and other PC makers. By 2000, Windows was the operating system on 95% of the desktop and laptop computers in the world.

What's happened with mobile operating systems since the introduction of the iPhone in 2007 is another story with a similar narrative arc that isn't in the book, albeit with different players and a stronger position for Apple. While the open source Android dominates the global market for mobile operating systems, sales of hundreds of millions of iPhones and iPads, backed by a tightly integrated iTunes ecosystem for apps, movies, and music, have made Apple the most valuable tech company in the world.

As Isaacson notes, however, by the 1990s, there were at least three models for software development:

  1. the Apple approach, where hardware and software are tightly bundled and proprietary.

  2. the Microsoft approach, where a proprietary operating system is unbundled from the hardware.

  3. a free and open software approach, where software can be freely used, modified, and shared.

On that last count, Isaacson recounts how Linus Torvalds' "release of the Linux kernel led to a tsunami of peer-to-peer volunteer collaboration that became a model of the shared production that propelled digital age innovation."

Crucially, Isaacson grounds the success of that open source model in both history and human nature.

"Peer-to-peer collaboration and commons-based collaboration were nothing new," he wrote. "An entire field of evolutionary biology had arisen around the question of why humans, and members of some other species, cooperate in what seem to be altruistic ways. The tradition of forming voluntary associations, found in all societies, was especially strong in early America, evidenced in cooperative ventures ranging from quilting bees to barn raisings."

On that count, Isaacson cited Ben Franklin's autobiography, which described an entire "civic creed," with the motto "to pour forth benefits for the common good is divine," and noted the associations that would create "a hospital, militia, street-sweeping corps, fire brigade, lending library, night-watch patrol, and many other community endeavors."

In a conclusion that may disappoint some vocal members of the technology world's tribes, Isaacson's history doesn't come down on one side of the debate between open and closed systems. Rather, he recommends a combination of them.

"Each model has its advantages, each had its incentives for creativity, and each had its prophets and disciples," he wrote. "But the approach that worked best was having all three models coexisting, along with various combinations of open and closed, bundled and unbundled, proprietary and free. Windows and Mac, UNIX and Linux, iOS and Android: a variety of approaches competed over the decades, spurring each other on -- and providing a check against one model becoming so dominant that it stifled innovation."

In general, Isaacson says that the lesson of modern economics that applies to digital-age innovation is that a "combination of all of these ways of organizing production -- government, market, and peer sharing -- is stronger than favoring any of them." This balance is a strong thread throughout the book and offers something of a quiet corrective to woefully ahistorical accounts of the internet's origins:

"The Internet was built partly by the government and partly by private firms, but mostly it was the creation of a loosely knit cohort of academics and hackers who worked as peers and freely shared their creative ideas. The result of such a peer sharing was a network that facilitated peer sharing. This is not mere happenstance. The Internet was built with the belief that power should be distributed rather than centralized and that any authoritarian diktats should be circumvented."

Beyond the cultural underpinnings and standards of the network of networks we use today lies an important insight into how it all came to be.

"The creation of a triangular relationship among government, industry and academia was, in its own way, one of the significant innovations that helped produce the technological revolution of the late twentieth century," wrote Isaacson. "The Defense Department and the National Science Foundation soon became the prime funders of much of America's basic research, spending as much as private industry during the 1950s through the 1980s*. [*By 2010, federal spending on research had dropped to half of what was spent by private industry.] The return on that investment was huge, leading not only to the Internet but to many of the pillars of America's post-war innovation and economic boom."

More recently, efforts like the BRAIN Initiative have sought to launch a modern-day "moonshot," albeit with far less relative funding. Given what has come before, whether that kind of taxpayer support for basic research and development is worth the investment shouldn't be in question today -- but it is.

"Over the course of three decades, the federal government, working with private industry and research universities, had designed and built a massive infrastructure project, like the interstate highway system but vastly more complex, and then threw it open to ordinary citizens and commercial enterprises," wrote Isaacson. "It was funded primarily by public dollars, but it paid off thousands of times over by seeding a new economy and era of economic growth."

It's frankly impossible to finish this book and not marvel at the genius, audacity, grit, ambition, generosity, and humanity behind the decades of invention and advances that have brought us to the future we live in today. The innovations of today and tomorrow would not be possible without the foresight, hard work, and generosity of generations of men and women over the last century. I can't wait to see what lies ahead.

Disclosure: TechRepublic and Simon & Schuster, the publisher of "The Innovators," are CBS brands.
