After the dot-com crash in 2001, the tech world needed a few years to regroup. But starting around 2004, the year Facebook was founded and Google went public, the winds of innovation in consumer- and business-facing technology began to pick up again. In 2007 or so, they reached hurricane speed, and minus a short lull for the Great Recession, we’ve been buffeted by continuous change ever since, with the biggest advances coming in the overlapping areas of mobile, social, and cloud computing.
But for all its power, this storm was initiated by a surprisingly small group of players. Just three companies—Google, Apple, and Facebook—generated most of the new ideas (at least the mainstream ones) and most of the business momentum. (If I had more room and time, I’d work Amazon into the argument, but as a technology company, it ranks well behind the other three.) It’s been this way for almost a decade now, meaning it’s becoming harder and harder to imagine change coming from any other source.
That’s why journalists hang on every word from Larry Page, Tim Cook, or Mark Zuckerberg, and it’s why there’s perennial hand-wringing in the media about whether the Next Big Thing from Google, Apple, or Facebook—be it Google Glass, or the seventh iteration of iOS, or a new Android home screen populated by chat heads—is really as big as the Last Big Thing. Any sign that the giants might be faltering sends psychological shock waves through the whole high-tech culture, from venture investors to startup employees to eager technology consumers.
Well, at the risk of making myself into a pariah around Silicon Valley, I have a prediction to make. The storm has just about run its course. We have passed peak Apple, peak Google, and peak Facebook.
By which I mean: Apple will never again come out with a product as transformative as the iPhone. Google will never build anything more useful than its existing search engine, and it will never discover another business model as lucrative as search-based advertising. And Facebook may keep growing until every person on Earth with a computing device is a member, but it won’t ever be anything more than a place we share photos and links.
In sum, the next major advances in technology—the ones that will power the next cycle of entrepreneurship in Silicon Valley and the nation’s other tech hubs—will have to come from somebody else. To switch metaphors, the car is already out of gas; we just think there’s still forward progress, because we haven’t coasted to a stop quite yet. But we will. Chances are we’ll locate another gas station eventually—we always do. But before that happens, we may spend some time stuck by the roadside with the hood up, as we did in 2001-2004.
I’ll explain the thinking behind this prediction in a second. But first, let me be clear: I don’t think the petering out of the Google-Apple-Facebook triumvirate is cause for panic. In fact, it’s probably a good thing. No one should be allowed to lead for too long, or they get lazy and selfish. And a healthy innovation ecosystem needs a broader base than the one we have now. (That’s why I’ve been on the record rooting for Microsoft in the mobile wars.)
A situation in which power is spread between three companies was a big improvement over previous eras of computing, when a single company tended to dominate (for decades it was IBM, then it was Microsoft). In the next era, we may see something more like a true republic of technology, with no single company or group of companies possessing enough power to push others around and set the whole agenda for growth, the way Apple and Google have done in the mobile business with iOS and Android. That would be a good thing.
Now to the core of my argument. Here are three reasons why it’s a bad bet to expect any more game-changing innovations from Google, Apple, or Facebook.
1. Regression to the mean. Compared to the other companies in their cohorts, Apple, Google, and Facebook are all extreme outliers—the richest of the rich. They are the 1 percent; they’ve performed several standard deviations above the mean. Plain old statistics dictates that performance this good is usually followed by a dropoff in the direction of mediocrity.
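The statistical point can be made concrete with a toy simulation (all numbers and names here are invented for illustration, not drawn from real company data): model each company's measured performance as fixed skill plus random luck, and the top performers of one period fall back toward the mean in the next, simply because the luck component doesn't repeat.

```python
# Toy illustration of regression to the mean: performance = skill + luck.
import random

random.seed(42)
N = 10000
skill = [random.gauss(0, 1) for _ in range(N)]           # persistent ability
period1 = [s + random.gauss(0, 1) for s in skill]        # ability + one-time luck
period2 = [s + random.gauss(0, 1) for s in skill]        # same ability, fresh luck

# Select the "1 percent": the top period-one performers...
top = sorted(range(N), key=lambda i: period1[i], reverse=True)[: N // 100]
avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
# ...and their period-two average drops back toward the population mean,
# because much of their period-one edge was luck that doesn't recur.
```

The top group stays above average in period two (skill persists), but its standout performance doesn't: plain probability, no narrative about "losing their edge" required.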
You don’t reach this level of success by coming up with just a single earthshaking innovation—it usually takes two or more. Let’s spell it out. Google’s two fundamental innovations—now long behind it—were 1) if you look at the Web as a network of trust, defined mainly by links, you can use math to surface the best content, and 2) if you put related ads next to that content, people will click on them.
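That first insight is, in essence, PageRank. A toy sketch (emphatically not Google's production code; the four-page web below is invented purely for illustration) shows how "using math to surface the best content" from a network of links actually works:

```python
# PageRank-style power iteration: treat each link as a vote of trust,
# and repeatedly redistribute each page's score along its outbound links.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # a dangling page spreads its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

web = {
    "home":   ["about", "blog"],
    "about":  ["home"],
    "blog":   ["home", "about"],
    "orphan": ["home"],  # links out, but nothing links to it
}
ranks = pagerank(web)
# "home" collects the most inbound votes and ends up ranked highest;
# "orphan", which no page vouches for, ends up with the minimum score.
```

The scores always sum to one, so a page can only rise by attracting links at other pages' expense, which is what makes the link graph behave like a network of trust.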
Facebook’s innovations were 1) if you make it super-easy for people to post photos and status updates to their feeds or timelines, it will give their friends a convenient way to feel like they’re staying in touch, creating a network that grows virally, and 2) if you make your identity system into a platform—something resembling single sign-on for the whole Web—you’ll have endless fuel for the feed.
Apple’s innovations were 1) if you pay attention to style, design, and ease of use, you can build computers that people will love, because they make work feel a little bit like play, and 2) if you marry this design sensibility with wireless technology and digital content like music, movies, TV shows, books, and games, it’s like putting a match to a pile of paint-soaked rags; you get a true mobile-computing explosion.
The chances that any of these companies (or indeed any given company) will come up with a third, equally earthshaking insight in the future are low. We have a tendency to assume that preternaturally lucky people are preternaturally smart, and that their luck will continue, but those are errors built into the way people think; psychologist Daniel Kahneman calls them “the illusion of skill” and “the illusion of pattern.” It would be safer and more rational to assume that somebody else will get lucky next time, and that Google, Apple, and Facebook’s best days are behind them.
2. True paradigm shifts don’t come along very often. If you took a sociology or history-of-science class in college, you probably came across Thomas Kuhn’s book The Structure of Scientific Revolutions. Kuhn’s basic argument is that science progresses by fits and starts. In any given discipline, such as physics or biology, each era is characterized by a reigning dogma or paradigm that stays in place until researchers have collected so many anomalies, flaws, and exceptions that the old dogma needs to be thrown out. As soon as somebody comes up with a new explanation that accounts for the anomalies, everything changes overnight. Between these paradigm shifts are long periods of what Kuhn calls “normal science” or “mopping up,” when practitioners are busy fitting observations into the reigning theory.
That’s what’s going on right now in fields like search, social networking, and mobile computing. The market is still absorbing and adjusting to the shifts that Google, Facebook, and Apple introduced in the 2000s. In search, Google is still fiddling around to find the right mix of statistical and human-curated results to give users the information they need and maximize click-through rates on ads. In social networking, perfecting the look and feel of news feeds and mobile interfaces seems to be more than enough to keep Facebook’s engineers and designers occupied.
And very importantly, the whole world of enterprise computing is busy absorbing the infrastructure technologies that Google, Facebook, and to some extent Amazon needed to invent to make their own products work—namely, distributed, flexible, scalable processing and storage technologies like MapReduce, Hadoop, EC2, and S3. Arguably, that’s what the whole software-as-a-service or cloud-computing movement is about, and it’s a huge side bonus of the search and social-networking revolutions.
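The MapReduce model those systems popularized can be sketched in a few lines: a map phase emits key/value pairs from each record, a shuffle groups the pairs by key, and a reduce phase aggregates each group. This single-machine toy (real Hadoop or MapReduce jobs distribute the same three phases across clusters of commodity servers) counts words:

```python
# Minimal in-memory sketch of the MapReduce programming model.
from collections import defaultdict

def map_phase(record):
    # Emit (key, value) pairs: one (word, 1) per word in the record.
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(word, counts):
    # Aggregate all values that were shuffled to the same key.
    return word, sum(counts)

def mapreduce(records):
    shuffled = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            shuffled[key].append(value)  # the "shuffle": group by key
    return dict(reduce_phase(k, v) for k, v in shuffled.items())

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = mapreduce(docs)
```

Because the map calls are independent and each reduce sees only its own key's values, both phases parallelize trivially, which is the whole appeal of the model for web-scale data.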
As for Apple: it pains me to say this, given that I’m a big fan, but the company’s presentations at this week’s Worldwide Developer Conference confirmed that the bar for “innovation” in Cupertino is dramatically lower than it was a few years ago. The most interesting thing about OS X Mavericks, the newest version of the operating system for the Mac, is that it wasn’t named after a big cat. The new Mac Pro is supposed to be cool because it’s a black cylinder instead of an aluminum box. On the iOS side, the references to virtual cows and green felt showed how much energy has been diverted inside the company into debates over flat vs. skeuomorphic designs. That’s all classic mopping-up stuff. [Corrected 6/15/13: an earlier version of this paragraph stated that the case of the new Mac Pro is made of plastic. It's aluminum.]
In science, the mopping-up phase can go on for a very long time, especially if the preceding paradigm shift was particularly dramatic. Look at relativity and quantum mechanics in physics, or the discovery of the genetic code in biology; many decades later, we’re still dealing with those breakthroughs. To a large extent, Apple, Google, and Facebook are now victims of their own success—there’s so much room left for minor innovations within the ruling idea that they probably aren’t too concerned about, or even aware of, the flaws, anomalies, and limitations that eventually surface in every paradigm.
3. The innovator’s dilemma is real. Clay Christensen was right—it’s very difficult for large, established companies to keep innovating, because after a certain point, they can’t muster the courage to disrupt their existing revenue streams. In fact, as a tech journalist who’s been watching the field for almost 20 years now, I’m tempted to say that innovation only comes from startups, period.
Some of that is just by definition. After all, the whole job of an established company—especially one that’s accountable to the public markets—is to scale an existing business model as far as it can possibly be scaled. A startup, at least if you buy Steve Blank’s definition, is an organization designed to search for a scalable model—and given that this process is so full of difficulties and disappointments, it’s no surprise that companies stop searching as soon as they find one.
The only thing that keeps me from making the blanket statement that innovation only comes from startups is the occasional counterexample. Apple itself provided a stunning one in 1984 when it introduced the Macintosh, and another in 2007 when it introduced the iPhone. But generally, big companies can only reinvent themselves this way if they’ve got iconoclastic leaders powerful enough to sponsor skunk-works operations that are deliberately isolated from the larger company.
Apple lost its chief innovator when Steve Jobs died. Google may still have one in the form of Sergey Brin—but it’s unclear, since his skunk-works operation, Google X, appears to be more of a hobby than a business. Things like self-driving cars, wearable displays, airborne wind turbines, and space elevators might eventually lead to big shifts in the economy, but it’s hard to see how they will affect Google’s operations anytime soon.
The point is, the characters in our story have grown well past the point where most companies are able to stay innovative. Apple is 37 years old and has 50,000 employees, and its last major innovation, the iPad, came out in 2010. Google is 15 years old and has 39,000 employees (54,000 if you count Motorola), and its last major innovation—AdSense—came way back in 2003. All of the big changes at Google since then have been the result of acquisitions (e.g., Android). Facebook is 9 years old and has north of 4,600 employees, and its last big innovation push, the Facebook Platform, came in 2007. It’s safe to say that if these companies were going to introduce anything else that’s truly innovative, they would have done it already.
Just to reduce the risk of misunderstanding, let me be clear about what I’m not saying.
I’m not saying that Google, Apple, or Facebook will disappear anytime soon. I’m sure they’ll all be around for a long time—decades, probably—and that they’ll continue to play big roles in the lives of technology consumers, not to mention the lives of entrepreneurs and investors in Silicon Valley (where they’ve become the prime M&A engines providing exits for smaller startups). They will collect, process, and store a growing portion of our personal data, and they’ll continue to flesh out the universe of info-gadgets until we’ve got devices of every size and form factor—from big HDTV screens to wristwatches, and from pocket touchscreens to wireless contact lenses.
I’m also not making a Francis Fukuyama-style, end-of-history argument. There is plenty of room left for innovation in computing, even if (as many are now predicting) Moore’s Law sputters out after another generation or two of advances in chip design. There is huge, unrealized promise in areas like artificial intelligence, simulation and modeling, immersive digital entertainment, frictionless commerce between makers and entrepreneurs around the world, and the automation of things humans aren’t very good at, like driving or microsurgery.
I’m just saying that the next few paradigm shifts in information technology probably won’t be spearheaded by the same players who brought us the last few. The tech media’s obsession with Apple, Google, and Facebook is understandable—they provide plenty of drama and suspense, and they’re all masters at ginning up publicity around their latest incremental product releases—but it’s ultimately shortsighted.
Investors on Wall Street seem to agree with me, at least when it comes to Apple and Facebook. Apple’s stock peaked around $700 in September 2012. Facebook never lived up to expectations; its share price is down 38 percent since its May 2012 IPO. Google has managed to do better—its stock price is currently hovering around its all-time high of more than $900, which reflects its position as the strongest of the triumvirate (after all, it’s still got an unrivaled position in the search, advertising, and mobile operating system markets).
Here’s one big implication: It’s time for the real exodus at Google, Apple, and Facebook to begin. Silicon Valley thrives on the recycling of talent—it’s been that way ever since the “traitorous eight” left Shockley Semiconductor to start Fairchild Semiconductor in 1957. The hundreds of Facebook employees who became millionaires after the company went public aren’t doing themselves or their company any favors by sticking around. They should go out and start new companies to find the next paradigm right now, while most of their peers are preoccupied with mop-up work. The rewards, for them and for consumers, could be fabulous.
[Update, 6/20/13: Hundreds of readers have shared their reactions to this column via Slashdot, Reddit, Hacker News, and Xconomy's own comment section, below. A few readers thought I hit the mark; most seem to think I'm an idiot. In any case, I've put some of the most interesting comments together into this handy summary.]