Welcome to the Seven-Year Technology Pause

What’s even harder to endure than the collapse of an economic or technology bubble? The long lull that follows.

The current bubble hasn’t popped quite yet—and it might not, at least not in the sudden and messy way that the dot-com bubble did, in March 2000. What’s just as possible is that we’ll see a slow leak. I’m talking about a gradual downgrading of expectations about the technology economy—reflected in concrete measures like falling company valuations, lower rates of venture investment and new company formation, and a slowdown in IPO and M&A activity—but without a dramatic crash on the NASDAQ or the NYSE.

In any case, I expect the mobile/cloud/Internet bubble to end in the next year or so, whether it happens with a bang or a whimper. There are just too many signs of irrational excess to ignore; something has to give. What I want to discuss today is what comes next. If you think of yourself as a technology entrepreneur, an innovator, or an early adopter, you won’t be happy with my predictions.

I think we’re in for an extended pause in technology progress: a period with no major changes in the way computing and communications are structured.

We’ve seen such pauses in the past. They’re the downhill sides of longer cycles of roughly 15 years in the development of computing. The current cycle began in 2007, so going by my theory, we won’t see the next set of game-changing innovations until the early 2020s—well into the second term of the Hillary Clinton or Jeb Bush administration.

In a place like Silicon Valley, when boom times bring dizzying change and the mouthwatering prospect of instant wealth, the idea of roughly seven years of stasis might sound terrifying. But it’s not a death sentence; it’s just a breather. For average consumers, it might even be a good thing. It would give us time to reshape our habits, laws, and institutions to adapt to the enormous changes we’ve already seen in this century. And it’s worth noting that I’m mainly talking about information technology here. I don’t think there’s any reason to expect—and we certainly can’t afford—a slowdown in areas like clean energy, transportation, food production, and life sciences.

Why do I think we’re at the beginning of a seven-year pause? Because of history. Even if chipmakers keep finding new ways to lower the cost of computing power, the way this power gets organized, channeled, and used by consumers doesn’t change smoothly. It advances in violent jolts. The advent of the Mosaic Web browser in 1993 was one such jolt, as was the introduction of the iPhone in 2007. Since then, I don’t think we’ve seen a single game-changing new innovation, nor do I see one on the near horizon.

Wearables? No, that’s an area marked by tentative and incremental advances. Virtual reality? It’s still many years away from the mainstream, whatever Facebook’s reasons for spending $2 billion on Oculus. Virtual assistants? Ditto—Siri, Google Now, and Cortana are interesting, but the AI advances needed to make these systems truly useful will take a while longer. (But this is actually my favorite candidate for the next big jolt; more on that below.)

My perspective comes from looking at patterns of acceleration and deceleration in technology development over the last 75 years, and from thinking about periodicity in other fields. One of my favorite professors in college was the evolutionary biologist Stephen Jay Gould. In the early 1970s, Gould and his colleague Niles Eldredge proposed the idea of “punctuated equilibrium” to explain big gaps or jumps in the fossil record that, under a more traditional, gradualist view of Darwinian evolution, shouldn’t have been there.

The story Gould and Eldredge came up with—and it’s accepted today by most paleontologists—is that evolution proceeds in fits and starts. There are rare, sudden moments when thousands of new species can emerge. In between, there are long periods of stasis when there’s still genetic variation from generation to generation, but species mainly wobble around a “phenotypic mean.” The changes don’t accumulate; a finch stays recognizable as a finch.

Information technology doesn’t evolve in the same sense that species do. The forces and time scales involved are completely different. But if you follow the history of computing hardware, you still see patterns that look a lot like punctuated equilibrium. The periods of major “speciation” and branching are brief and relatively rare, and they seem to come every 15 to 20 years or so. Here are a few of computing’s big breakthrough moments:

1946: The construction of the first electronic general-purpose computers like ENIAC, powered by vacuum tubes.

1964-65: The advent of mainframes like the IBM 360 and minicomputers like the DEC PDP-8, powered by transistors and integrated circuits.

1981-84: The PC revolution, exemplified by the IBM PC, the Commodore 64, and the Macintosh. (This was a long-lasting change: our laptops today are just portable PCs.)

1993-1996: The first stirrings of handheld computing, with the Apple Newton and the Palm Pilot. These devices never reached a mass market, but they laid the groundwork for…

2007-2008: The smartphone revolution, led by the Apple iPhone, quickly followed by the first Android phones.

That’s just on the hardware side, of course. In networking and communications, there’s been a parallel set of advances, like the creation of the Arpanet in 1969 and the full commercialization of the Internet in 1995. But the key events in networking were the invention of the World Wide Web in 1989 and the development of the Mosaic browser in 1993. Mosaic popularized the Web and led quickly to the founding of Netscape and the whole dot-com and social media phenomenon.

Note just how much time went by between Mosaic and the iPhone, the two signature events of the modern computing era: 14 years. The first seven years, from 1993 to 2000, saw the rise of e-commerce, a huge wave of venture capital investment in free-spending Internet companies, enormous hype, and a traumatic crash, with much attendant drama in the way of bankruptcies and accounting scandals. The next seven years, from 2000 to 2007, saw cost-cutting, far more caution on the part of investors, and a slow rebound.

In the shadows, two key trends took shape: first, the use of “Web 2.0” techniques like Ajax—built on JavaScript and XML—to make Web pages far more interesting and interactive, and second, the rise of cloud computing, embodied in Software-as-a-Service companies like Salesforce.com and storage and processing utilities like Amazon Web Services. But these were infrastructure technologies. While they provided the foundation for much future progress, they didn’t fundamentally change the way we use computers.

The next big jolt didn’t come until January 9, 2007, when Steve Jobs unveiled the iPhone. The device wasn’t notable for its technology, per se: camera phones had been around for a long time, as had touchscreens, portable music and video players, and even mobile apps. The genius of the iPhone was in the way it organized all these elements and made them a joy to use. With the addition of the iTunes App Store in 2008, Apple also conjured up a marketplace for third-party software that would create a new profession of mobile developers and make the iPhone ever more useful and powerful.

The hypothesis I’m offering is that the biggest revolutions in information technology are the ones like Mosaic and the iPhone that inspire us to reorganize the way computing fits into our lives, thereby creating lots of room for follow-on innovation. Those are the kinds of breakthroughs that only seem to come along every 15 years or so. The first half of the intervening stasis period seems to be about experimentation and growth; the second half is about retrenchment and mopping up. We experienced the last big shift in 2007, so the question is, what now?

From a business perspective, the post-iPhone period (2007-2014) bears a pretty strong resemblance to the post-Mosaic period (1993-2000). We’ve witnessed strong gains in venture investment, an explosion in the number of tech startups, and a monstrous increase in the population of tech millionaires and billionaires. It looks a lot like a bubble, and it will likely go bust. If investors are rational enough to pull back gradually, we could have a soft landing rather than a crash.

Either way, it’s looking likely that 2015-2022 will be a repeat of 2000-2007, meaning that we’re mainly just riding out the cycle until the next big speciation event. The phyla we’ve got are swimming along fine—people are still buying iPhones at the rate of 160 million per year (though iPad sales are slumping, indicating the tablet market may already be saturated). But an iPhone 5s is still recognizable as an iPhone; we’re wobbling around the mean. I see no signs that any reorganizing innovation will arrive in the near term, and if it did, it would be ahead of schedule.

If I had to guess where the next organizing innovation will happen, around 2022, I’d say artificial intelligence. In a way, both Mosaic and the iPhone were about taking existing collections of digital data (text documents, photos, songs, videos) and making them more accessible, remixable, and useful. A truly smart smartphone would be able to take all that stuff, plus everything it knows about us and our relationships and our surroundings, and act as an omnipresent guide, counselor, watchdog, and concierge. Siri and Cortana could become the grandmothers to a whole new race of virtual assistants, each tailored to its owner, like Samantha in Her.

Let’s just hope they don’t get together and decide it’s time for a mass extinction. 

The Author

Wade Roush is a contributing editor at Xconomy.

  • Cynthia R. Cauthern

    Great article and thought provoking. Eight years until the next information technology breakthrough is a long time to wait, but it is factual that these paradigm shifts happen every 15-20 years.

  • bespoked

    Many thanks for yet another great, thought-provoking article. This is one of those pieces whose implications I will be pondering for, oh, say, at least seven more years! :-)

  • MITDGreenb

    Great article, Wade!

    I humbly submit my own version, written one cycle ago: http://www-07.ibm.com/services/pdf/IBM_Consulting_crash.pdf (PDF) That, in turn, references a discussion from Warren Buffet observing the same boom-bust-flatline pattern you discuss… in the DOW: http://money.cnn.com/magazines/fortune/fortune_archive/2001/12/10/314691/

    See what you miss from us here in Boston? :) (We miss you!)

    • http://www.xconomy.com/san-francisco Wade Roush

      Thanks Dan. That IBM white paper is a real blast from the past. Do you share my sense that we’re now at or near the top of another cycle, and cruising for a bruising?

      • MITDGreenb

        I view things from the framework of that paper. The Financial Markets are in their flat cycle, starting 2001. That goes to about 2018. I like your idea that innovation markets follow a similar pattern, with 2020-2022 the next game-changer. As long as they don’t coincide, we won’t get too bruised IMO.

        But what is that game changer? I agree with you that it’s some new way of organizing/disseminating information (driven by a complementary ad model), but I can’t visualize what it is other than something BORG-like.
