Nine Lessons for Innovators from a Nobel Prize-Winning Psychologist

I was pretty slow about getting around to reading Thinking, Fast and Slow. The career-capping book by Princeton psychologist Daniel Kahneman, one of the founders of behavioral economics, spent months on all the bestseller lists back in 2011. I finally picked up a paperback copy a couple of weeks ago.

The book is mainly about the limits of intuition and the biases—seemingly built into the way the human mind has evolved—that keep us from acting in accord with logic, rationality, and statistics. For example, the “anchoring effect” means we’re highly suggestible when it comes to numbers and prices. No matter how much soup they really want, grocery-store customers buy more when there’s a sign saying “Limit 12 cans per customer.”

It was Kahneman’s studies of such impulses that won him a Nobel Prize in economics in 2002. Together with Amos Tversky, Richard Thaler, and many others, Kahneman overturned economists’ old picture of society as a mathematical utopia in which individuals act rationally to maximize their own utility. Thinking, Fast and Slow is a lengthy, detailed, yet approachable summary of that revolution. You’ve probably seen or read popular behavioral-economics books like Steven Levitt and Stephen Dubner’s Freakonomics or Dan Ariely’s Predictably Irrational; those are like sugary desserts next to Kahneman’s protein-packed tome.

As I absorbed Kahneman’s points about all the ways human judgment breaks down under various stresses and distractions, I couldn’t help looking for lessons that might apply to the high-tech circus we insiders sometimes call, in a self-congratulatory way, the “innovation ecosystem.” By which I mean the researchers and developers who incubate new technologies inside universities, corporate labs, and garages; the entrepreneurs who turn these new ideas into products; the angel and venture investors who place bets on the entrepreneurs; and the eager customers who fuel the whole process.

And such lessons abound. Indeed, entrepreneurs, executives, and investors are prone to so many kinds of errors that they almost seem to be Kahneman’s favorite subspecies of Homo economicus. In the end, I think Kahneman’s observations about bias lead to a big puzzle about the nature of entrepreneurship and technological progress. But before I get into that, I’ll relate a few examples from the book—each one richly supported by the psychological experiments and surveys conducted by Kahneman and his colleagues over the last three or four decades:

1. The illusion of understanding: If we can fit past events into a satisfying story, we think we understand what really happened, and we can’t imagine things turning out any other way. Here Kahneman cites the example of Google, which was started by two Stanford graduate students, Larry Page and Sergey Brin, who lucked into one of the biggest untapped markets in the history of business (i.e., search-based advertising) and came out looking like invincible geniuses. In fact, there were numerous points at which Google’s story could have taken a drastically different turn—such as 1999, when Page and Brin were willing to sell the company for $1 million but the buyer thought the price was too high. But luck took them in a different direction. “A compelling narrative fosters an illusion of inevitability,” Kahneman observes.

2. Outcome bias: Closely related to the illusion of understanding, this is the tendency to reward or blame decision makers for the performance of their organizations, even though the correlation between leadership quality and corporate performance is generally low. “We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact,” Kahneman writes. “Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak.”

3. The illusion of pattern: Kahneman thinks we’re too quick to ascribe meaning to events that are the product of pure chance. A basketball player who sinks three or four baskets in a row is seen as having a “hot hand,” and a CEO who oversees several successful product launches or acquisitions acquires a reputation for extraordinary insight or skill when in fact, like Page and Brin, he was probably just fortunate. “We are far too willing to reject the belief that much of what we see in life is random,” Kahneman warns.

4. Nonregressive explanations: An outstanding performance is likely to be followed by a mediocre one. This isn’t backsliding: it’s usually just regression to the mean, the tendency of variables to drift back toward their historical average. The concept is well established, but because we’re hard-wired to seek causal rather than statistical explanations, we have a hard time accepting it (the short simulation after this list illustrates the effect). One corollary is that we shouldn’t punish a company that fails to follow up a stellar product with an even more stellar one (the iPad and its regrettable sequel, the iPad mini, come to mind). Another is that all extreme predictions are unreliable; we shouldn’t believe any entrepreneur who says his company is the next Google.

5. The illusion of validity, also known as the illusion of skill: We’re strongly influenced by the world in front of our eyes, and unwilling to admit that there’s much we don’t know—a phenomenon that Kahneman calls WYSIATI, for What You See Is All There Is. As a result, we come to believe—sometimes fiercely—that our own predictions are accurate, even when it wouldn’t take much digging to show that they’re little better than random guesses. For example, data from a firm of wealth advisors with whom Kahneman consulted showed zero year-to-year correlation in the success of individual advisors, meaning that the supposed talent for which these advisors were rewarded with annual bonuses was entirely illusory (a pattern the simulation after this list reproduces). But when Kahneman presented these findings to the firm, he was politely ignored. “The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry,” Kahneman concludes.

6. The optimistic bias: Entrepreneurs know that the chances of success for a new business are low—only 35 percent of small businesses in the U.S. survive for five years. But they don’t seem to think the statistics apply to them, or they’d probably never start. Closely related to this is the planning fallacy, which leads decision makers to neglect historical data and rely on forecasts that are really more like best-case scenarios. The optimistic bias may also explain why so many corporate mergers and acquisitions go bad: the leaders of the acquiring firm overestimate their own competence and “make huge bets…acting on the mistaken belief that they can manage the assets of another company better than its current owners do.”

7. Overconfidence: The illusion of validity leads decision makers to undertake bold programs based mostly on their insular view of the facts. “Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true,” Kahneman writes. But he also sees an institutional bias at work. Organizations and markets reward leaders who act with blind confidence. “An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want,” he writes. “Acting on pretended knowledge is often the preferred solution.”

8. Competition neglect: Entrepreneurs usually act as if their companies will rise or fall based on their own efforts, while ignoring what their competitors are up to. “These bold people think their fate is almost entirely in their own hands,” Kahneman says. “They are almost surely wrong: the outcome of a startup depends as much on the achievements of its competitors and on changes in the market as on its own efforts. However, WYSIATI plays its part, and entrepreneurs naturally focus on what they know best—their plans and actions and the immediate threats and opportunities, such as the availability of funding. They know less about their competitors and therefore find it natural to imagine a future in which the competition plays little part.”

9. The focusing illusion: Business leaders are prone to many biases, but so are consumers—the people who ultimately buy the stuff entrepreneurs are selling. One is the focusing illusion, which can be reduced to the observation that, in Kahneman’s words, “nothing in life is as important as you think it is when you are thinking about it.” This illusion leads early adopters—and I count myself in this group—to spend lots of money on heavily hyped products that, in reality, are either off the wall (Google Glass) or only marginally better than what came before (the iPhone 5 vs. the iPhone 4S). “The focusing illusion creates a bias in favor of goods and experiences that are initially exciting, even if they will eventually lose their appeal,” Kahneman writes. This leads us to make bad choices based on overly optimistic forecasts of our future happiness—a pattern that Kahneman’s colleagues Daniel Gilbert and Timothy Wilson call “miswanting.”
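
Points 4 and 5, incidentally, are easy to see in a back-of-the-envelope simulation. The Python sketch below is my own illustration, not anything from Kahneman’s book; it assumes each advisor’s yearly result is mostly luck with a dash of skill, and under that assumption the year-to-year correlation comes out near zero while year one’s star performers look merely average in year two.

```python
# A toy simulation (my own, not from Kahneman's book) of points 4 and 5 above:
# when yearly "performance" is mostly luck plus a little skill, this year's
# stars regress toward the average next year, and year-to-year correlation
# is close to zero.
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(42)

N_ADVISORS = 200
SKILL_WEIGHT = 0.1   # assumed: skill explains only a small share of results
LUCK_WEIGHT = 0.9    # assumed: the rest is noise

def yearly_score(skill):
    """One year's performance for one advisor: a little skill, a lot of luck."""
    return SKILL_WEIGHT * skill + LUCK_WEIGHT * random.gauss(0, 1)

skills = [random.gauss(0, 1) for _ in range(N_ADVISORS)]
year1 = [yearly_score(s) for s in skills]
year2 = [yearly_score(s) for s in skills]

# Year-to-year correlation: near zero when luck dominates.
print(f"year-to-year correlation: {statistics.correlation(year1, year2):.2f}")

# Regression to the mean: year one's top decile looks ordinary in year two.
top_decile = sorted(range(N_ADVISORS), key=lambda i: year1[i], reverse=True)[:20]
print(f"top decile, year 1 average: {statistics.mean(year1[i] for i in top_decile):.2f}")
print(f"same advisors, year 2 average: {statistics.mean(year2[i] for i in top_decile):.2f}")
```

On a typical run, the correlation hovers around zero and the top decile’s average falls from well above the overall mean in year one to roughly the mean in year two; nothing went wrong, the luck simply didn’t repeat.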

These nine biases are just the beginning. The litany of thinking errors that Kahneman and others in his field have documented is so long it’s a wonder anything gets accomplished in the business world.

In fact, the farther I got into Kahneman’s book, the clearer it became that there’s a strange paradox at work. Kahneman says he wrote Thinking, Fast and Slow partly to help readers identify the biases in their own thinking and sidestep them in cases where they may lead to worse outcomes. He doesn’t argue that biases can be rooted out altogether—since, to a large extent, they’re neurologically hard-wired—but he thinks they can be mitigated through greater awareness.

The paradox is this: the whole world of venture-backed innovation is structured to reward people who take irrational gambles. In fact, you can argue that technological and economic disruptions of the sort that the tech ecosystem celebrates only come from people who display delusional levels of self-confidence and risk-taking. If every entrepreneur and investor were to read Kahneman’s book and become fully cognizant of the flaws in their thinking and the statistical realities they’re up against, Silicon Valley would have to put out a permanent “Gone Fishing” sign.

By the same token, if the people who donate money to startups on Kickstarter and buy Fitbits and Pebble watches and iPhones were to lose the conviction that each new gadget will make them incrementally happier, there’d be nobody to pave the way for wider adoption—and many genuinely worthy technologies might never see the light of day.

Kahneman himself seems to be aware of the contradictions. “I believe that someone who lacks a delusional sense of significance will wilt in the face of repeated experiences of multiple small failures and rare successes,” he writes. So the answer to the paradox may be that there are sectors of the economy where bias is not just desirable, but indispensable. Entrepreneurs are delusional. They’re looking at examples like Larry Page and Sergey Brin and Mark Zuckerberg and Kevin Systrom (Instagram’s co-founder) and—like buyers of Powerball tickets—fixating on a hyper-optimistic scenario in which they’re just as lucky.

But you know what? God bless ‘em all for their obvious overconfidence. If the traitorous eight engineers hadn’t had the pluck to leave good jobs at Shockley Semiconductor Laboratory in 1957 to start Fairchild Semiconductor, hundreds of subsequent Fairchild spinoffs, including Intel and AMD, would never have been formed—and Silicon Valley as we know it would not exist. And if today’s young entrepreneurs weren’t taking equivalent gambles with their careers, we tech journalists wouldn’t have anything to write about.

As T.S. Eliot wrote, “Only those who will risk going too far can possibly find out how far it is possible to go.” Kahneman and the behavioral economists didn’t need to prove that humans are irrational: the tech world did that long ago.

The Author

Wade Roush is a contributing editor at Xconomy.


  • JerryA

    Great summary, and in my view, a brilliantly insightful take on the paradox within the tech world in general.

    The only thing I take issue with is the belittling of Freakonomics and Dan Ariely up front. Arguably Freakonomics has had a far greater impact on the world than Kahneman has, and it continues to be more accessible through its very popular podcast. Ariely is brilliant too, bridging academia and Joe Public well. Dan is still relatively young, so expect much more from him. (Dan’s podcast, “Arming the Donkeys,” is also good.)

    This takes nothing away from the greatness of Kahneman, whose book is based on his and others’ research. From a purely academic sense, though, I would argue that Scott Plous’s 1993 book “The Psychology of Judgment and Decision Making” is just as good (and a much shorter read).

    As a decision-making “freak,” I prefer Plous and Kahneman, but I respect the difference, and I support the far greater impact that Freakonomics and Dan Ariely make on society.