the way people think; psychologist Daniel Kahneman calls them “the illusion of skill” and “the illusion of pattern.” It would be safer and more rational to assume that somebody else will get lucky next time, and that Google, Apple, and Facebook’s best days are behind them.
2. True paradigm shifts don’t come along very often. If you took a sociology or history-of-science class in college, you probably came across Thomas Kuhn’s book The Structure of Scientific Revolutions. Kuhn’s basic argument is that science progresses by fits and starts. In any given discipline, such as physics or biology, each era is characterized by a reigning dogma or paradigm that stays in place until researchers have collected so many anomalies, flaws, and exceptions that the old dogma needs to be thrown out. As soon as somebody comes up with a new explanation that accounts for the anomalies, everything changes overnight. Between these paradigm shifts are long periods of what Kuhn calls “normal science” or “mopping up,” when practitioners are busy fitting observations into the reigning theory.
That’s what’s going on right now in fields like search, social networking, and mobile computing. The market is still absorbing and adjusting to the shifts that Google, Facebook, and Apple introduced in the 2000s. In search, Google is still fiddling around to find the right mix of statistical and human-curated results to give users the information they need and maximize click-through rates on ads. In social networking, perfecting the look and feel of news feeds and mobile interfaces seems to be more than enough to keep Facebook’s engineers and designers occupied.
And very importantly, the whole world of enterprise computing is busy absorbing the infrastructure technologies that Google, Facebook, and to some extent Amazon needed to invent to make their own products work—namely, distributed, flexible, scalable processing and storage technologies like MapReduce, Hadoop, EC2, and S3. Arguably, that’s what the whole software-as-a-service or cloud-computing movement is about, and it’s a huge side bonus of the search and social-networking revolutions.
As for Apple: it pains me to say this, given that I’m a big fan, but the company’s presentations at this week’s Worldwide Developer Conference confirmed that the bar for “innovation” in Cupertino is dramatically lower than it was a few years ago. The most interesting thing about OS X Mavericks, the newest version of the operating system for the Mac, is that it wasn’t named after a big cat. The new Mac Pro is supposed to be cool because it’s a black cylinder instead of an aluminum box. On the iOS side, the references to virtual cows and green felt showed how much energy has been diverted inside the company into debates over flat vs. skeuomorphic designs. That’s all classic mopping-up stuff. [Corrected 6/15/13: an earlier version of this paragraph stated that the case of the new Mac Pro is made of plastic. It's aluminum.]
In science, the mopping-up phase can go on for a very long time, especially if the preceding paradigm shift was particularly dramatic. Look at relativity and quantum mechanics in physics, or the discovery of the genetic code in biology; many decades later, we’re still dealing with those breakthroughs. To a large extent, Apple, Google, and Facebook are now victims of their own success—there’s so much room left for minor innovations within the ruling idea that they probably aren’t too concerned about, or even aware of, the flaws, anomalies, and limitations that eventually surface in every paradigm.
3. The innovator’s dilemma is real. Clay Christensen was right—it’s very difficult for large, established companies to keep innovating, because after a certain point, they can’t muster the courage to disrupt their existing revenue streams. In fact, as a tech journalist who’s been watching the field for almost 20 years now, I’m tempted to say that innovation only comes from startups, period.
Some of that is just by definition. After all, the whole job of an established company—especially one that’s accountable to the public markets—is to scale an existing business model as far as it can possibly be scaled. A startup, at least if you buy Steve Blank’s definition, is an organization designed to search for a scalable model—and given that this process is so full of difficulties and disappointments, it’s no surprise that companies stop searching as soon as they find one.
The only thing that keeps me from making the blanket statement that innovation only comes from startups is the occasional counterexample. Apple itself provided a stunning one in 1984 when it introduced the Macintosh, and another in 2007 when it introduced the iPhone. But generally, big companies can only reinvent themselves this way if they’ve got iconoclastic leaders powerful enough to sponsor skunk-works operations that are deliberately isolated from the larger company.
Apple lost its chief innovator when Steve Jobs died. Google may still have one in the form of Sergey Brin—but it’s unclear, since his skunk-works operation, Google X, appears to be more of a hobby than a …