IT Matters: Nicholas Carr on Utility Computing, the Dangers of Internet Culture, and the Google Brain

1/8/08

In 2003, Nicholas Carr, then the executive editor of Harvard Business Review, sparked an enormous debate (and enraged quite a few technology vendors) with an HBR article entitled “IT Doesn’t Matter.” Because every company now has access to the same commodity computing hardware and software, Carr asserted, big IT investments no longer confer a competitive advantage, the way they did in the early days of corporate computing.

He reiterated his case (hedging a bit) at book length in 2004’s Does IT Matter? and became a fixture on the technology lecture circuit, polarizing audiences at every stop with his argument that, in the end, it makes little difference whether a company chooses HP or Sun, Oracle or IBM, Windows or Linux: that’s not where companies should be looking for an edge.

Some people read Carr’s original article as saying that information technology itself is unimportant. That’s a misinterpretation that, as Carr admitted in Does IT Matter?, may have been “traceable in some cases to a lack of clarity in defining the terms and scope of my argument,” and he spent much of the book trying to correct it. Indeed, he insisted that the replacement over the last two decades of expensive, customized, proprietary computing systems by a standardized infrastructure was a “natural, necessary, and healthy process” that continues to lift the entire global economy.

If there is any doubt left that Carr believes information technology to be critical to modern society, his latest book—published yesterday—should erase it. The Big Switch: Rewiring the World, from Edison to Google (W.W. Norton) takes a far broader view of the IT landscape. The book draws out the compelling parallels between an earlier technological transformation—electrification and the rise of electric utilities—and the emergence of “utility computing,” the transfer of many of the software and storage tasks once handled by our personal computers and local servers to far-away data centers handling millions of tasks simultaneously. Google’s online spreadsheet, word processing, and calendar software, which competes directly with Microsoft’s desktop productivity programs, is a low-intensity, consumer-oriented example of utility computing, while Salesforce.com’s Web-based account management software for small-to-medium-sized companies is one of the prime “software-as-a-service” offerings giving older business-software companies like Siebel and Oracle fits.

Aimed at general rather than managerial audiences, The Big Switch thoroughly explores both the positive and negative effects of the “computing cloud” that’s settling quickly over consumers and small businesses (and only slightly less rapidly over large corporations). Among the potentially negative effects is what Carr calls “the great unbundling”—the fact that Web technology has removed most of the constraints on the distribution of creative work and made it possible for consumers to obtain content without paying a subscription or even viewing an advertisement. Some of the best creative work, Carr worries, may be “crowded out of the marketplace by the proliferation of free, easily accessible” alternatives.

Carr, who lives outside Boston, visited Xconomy’s office last week to talk with me about his book and its themes. You can read the entire 6,000-word transcript of our conversation here (including fascinating digressions on Lewis Mumford, technological determinism, the future of newspapers and magazines in an era of information commodification, and the value of blogging versus book-writing). For readers with slightly less time to commit, here’s a handy streamlined version.

Xconomy: This book has a slightly different perspective from Does IT Matter?, which looked at whether IT matters on a more fine-grained scale to companies and organizations. And if I can summarize that book, the answer was no, because IT is ubiquitous and therefore there’s no particular competitive advantage to any one company. But now you’re saying that in a much broader sense, IT as a utility, just like electricity, does matter. In fact it makes a huge difference to the way our society is evolving.

Nicholas Carr: I’m taking a completely different perspective, really. The new book could even be called IT Matters, because it’s looking at all the ways that information technology does change things, in what I think are very important ways. The original book was written mainly for a managerial audience, who were thinking “Can I get a competitive advantage from these enterprise systems?,” and my answer there was largely no. This book is written for a much broader audience. And as I said, it really looks at what happens when IT—partially as a result of shifting to the utility model—becomes much cheaper, much more available, much more ubiquitous.

X: So there is a shift in perspective from your last book to this one. But there is also a shift in tone or perspective within the book itself. The first half is largely about the philosophy and logic behind utilities and how they arise, and how that laid the foundation for the amazing industrial growth seen in the U.S. in the 20th century, and for the growth of the middle class and all of the things that spread from that. That’s a hopeful story. Then in the second half of the book you turn the tables and say, “Okay, well, now that we have this stuff, let’s not pretend that it’s totally rosy.” I wonder where you come down in the end. On the whole, do you see the rise of computing as a utility as the path to an even brighter future, if we do it right?

NC: First of all let me step back and say I struggled with the structure of the book. As I got further and further into the research, I found myself being pulled in two ways. I was very enthusiastic about the story of technological discovery and advance, and the in some ways quite heroic efforts of the people who saw that mechanical power could become this cheap utility, and in a similar way, the people today pushing computing forward. As I say in the book, it’s natural to be very enthusiastic about that, because it’s a human achievement that has great effects on people. It also tends to spur a great deal of general optimism about the future, about progress, and about the technology.

On the other hand, as I thought more and more about the implications of this new computing grid, I became more and more concerned—in a quite despairing way at some points—about what could happen when these forces are unleashed. So the book does make this shift. And the hinge is the story of the effects that electricity had. There were many good effects, but when you compare it to the utopian dreams that were espoused at the very beginning of the electric utilities, you see that the effects were really much more complicated, and were both good and bad.

[Today] we have this great new technological system being built. But while there is much good that comes from this, I lay out several reasons for worry. Where I come out on it is that the dangers to society and to culture and most importantly to our sense of personal identity are greater than the benefits. And it has to do with taking the ethic of the computer—meaning very fast, very automated, also very structured in some ways—and beginning to apply it to the processes by which we communicate with each other, and create culture, and even define ourselves.

X: I guess I’m more of an optimist than you seem to be. To take one example, with today’s blogging tools it’s so easy to upload text, images, and recordings, and package them in a nice way, and even self-finance it by putting Google AdSense ads on your site. None of that was possible just a few years ago. To me that’s sort of like handing out canvas and paint to many more people than could have ever considered being painters before. So aren’t we creating the space for more Michelangelos?

NC: No, I don’t think so. It’s great that people have new ways to express themselves. But I don’t see the connection between that and instigating great art. In fact, the bad side of this is that it creates a superficial relationship between people and expression of all sorts. The net is training us to see all of this stuff as pretty much disposable. It seems to me that great art and truly great expression isn’t something that comes quickly. It comes through long hard work and contemplation and slow thinking. … Let’s not pretend that dashing off blog entries and reading other people’s blog entries and commenting might not take time away from other valuable cultural things you could be doing with your time, and doesn’t imply some kind of loss as well as some kind of gain. My guess is that it probably means we will have fewer Michelangelos rather than more.

X: Let’s go back to the computing cloud, and talk specifically about the information utilities of today. Google obviously figures prominently in the book and in all of our lives. And going along with their massive presence on the Internet, they are actually building massive computing plants like the one along the Columbia River in Oregon. The parallel between what they are doing and what Edison and [Samuel] Insull [Edison’s secretary, later founder of Commonwealth Edison] did is really interesting. At first everyone thought that they needed their own mainframes, and now it’s turning out that computing is more general and everyone can tap into it like a utility, and specialist organizations are kind of taking over, and Google is clearly one of those. Do you feel like that trend has a long way to go before it fully plays out?

NC: I think different spheres of computing will be affected at different paces. I think we are going to see continued massive investment in the computing grid. Google is certainly the most visible. But Microsoft is now scared and is throwing as many billions into it as Google. I think we’re seeing IBM beginning to make bigger investments, and lots of smaller companies, like Salesforce.com and Intuit, doing the same. They’re all realizing that if these software programs are going to be served over the Net and hopefully scaled up to millions of users, it takes a lot of hardware and a lot of electricity.

And of course as that happens, similar to what happened with electricity, it kind of spurs itself. Because inevitably as capacity grows, the price goes down, the capabilities go up, and there is less and less reason over time for going out and buying a piece of packaged software and installing it on your hard drive or your company’s server.

On the consumer side, most young people today have already moved into the cloud for most of their computing, and I think we’ll continue to see this [evolve] quite rapidly over the next five years. Most people at home doing their taxes or storing their photographs will go online. Small businesses will do what consumers are doing pretty quickly as well, because they don’t want to buy all this gear and hire people. Big companies are going to be the slowest, because they have huge investments in IT. They often have very specialized software. Google isn’t going to create robust Web-based programs to handle very esoteric business processes, at least not quickly. So I think for the next 10, 20 years, larger business computing is going to be very much a hybrid. They are going to pull in a lot of the more basic computing services, whether it’s raw compute power, storage, or the basic enterprise systems for accounting and account management. Those will go online quickly. But it’s going to be years before they close up their data centers altogether.

X: One of the companies we cover all the time is VMware, which makes virtualization software. You mention how dysfunctional the distribution of computing power has become, because every company went out and built its own data center to meet peak demand and then uses it 20 percent of the time. Virtualization offers the possibility of using that other 80 percent.

NC: Not only did they build their data centers for peak demand, they built each application and each separate computer for peak demand.

X: Exactly. So the market for virtualization is still huge, and there will be this wave of companies trying to make the most efficient use of their existing infrastructure. Then maybe they’ll figure out how to plug that into the cloud, so their virtualized data centers are part of the larger virtualized cloud.

NC: I think that is exactly how it will happen. We’re starting to see it already. Big corporations are realizing that virtualization can save them literally billions of dollars. I think we will see companies take the utility technologies that software-as-a-service firms are using and revamp their internal IT along those lines. Essentially, they will run their own utility. And over time, as that happens, the lines between the private ones and the public ones will start to blur as capacity is allowed to shift where it needs to go.
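
[To make the consolidation arithmetic in this exchange concrete, here is a minimal back-of-the-envelope sketch in Python. The workloads, capacities, and the pooling rule are all hypothetical, invented only to illustrate why sizing every application for its own peak strands capacity that virtualization can reclaim.]

```python
# Illustrative only: hypothetical per-application workloads, expressed as
# (average load, peak load) in arbitrary compute units.
workloads = [(10, 100), (20, 80), (5, 60), (15, 90)]

# Dedicated servers: each machine must be sized for its own app's peak.
dedicated_capacity = sum(peak for _, peak in workloads)

# Virtualized pool: capacity only has to cover the pool's worst case.
# Peaks rarely coincide, so assume (purely for illustration) the combined
# peak is the sum of the averages plus the single largest burst.
avg_total = sum(avg for avg, _ in workloads)
largest_burst = max(peak - avg for avg, peak in workloads)
pooled_capacity = avg_total + largest_burst

print(f"dedicated capacity: {dedicated_capacity} units")               # 330
print(f"pooled capacity:    {pooled_capacity} units")                  # 140
print(f"dedicated utilization: {avg_total / dedicated_capacity:.0%}")  # 15%
print(f"pooled utilization:    {avg_total / pooled_capacity:.0%}")     # 36%
```

[On these made-up numbers, the pool serves the same average load with less than half the hardware, which is the kind of savings Carr says big corporations are starting to notice.]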

X: When a revolution sets in, often you think “This is going to change everything forever.” And that’s sometimes true and sometimes it’s not. The utility model in electricity is pretty much here to stay. And the PC revolution would have seemed pretty permanent in, say, 1995, but now, as you say in your chapter on the retirement of Bill Gates, the PC era is coming to an end. Do you see the utility computing era as something that is more likely to be here to stay, at least for the foreseeable future, or is there yet another revolution hiding inside that somewhere, waiting to emerge?

NC: I think that some model of utility computing, if we define it broadly as a shared infrastructure for computing, is going to be the way we do computing for the future. If you assume that all of this stuff is going to stay networked—and I don’t see how we can go back to not being networked—that implies you are going to supply the resources in as economically efficient a way as possible through that network.

One thing I’m happy to admit is I don’t know what the ultimate structure of the utility computing industry will be—whether it’s going to be four companies that run everything or a bunch of smaller companies that have hashed out really strong standards so that data can flow between them very easily. But it does seem to me that it will involve the centralization of a lot of computing functions that have been fragmented.

X: At the very end of the book you point toward the possibility of brain-machine interfaces, and how Sergey Brin and Larry Page at Google are obsessed with this idea of tapping directly into the human brain, and at the same time taking Google’s massive databases and somehow endowing them with artificial intelligence. Does that strike you as something that’s any more realistic now than it was in, say, 1968, when we first met HAL in 2001: A Space Odyssey?

NC: Yes, it does strike me as more realistic. But I don’t think what’s being built is a replica of human intelligence. We are at the stage now where there is so much data connected, and so many microprocessors, and such powerful microprocessors, that I think the nature of computer programming is going to change. It’s going to be much more along biological lines, where we train computers to see patterns. And as we move toward a more semantic Web, where information is coded in a much richer way that allows computers to make connections between information on their own, I think we will see computing systems becoming able in some rudimentary way to think and make decisions without the kind of human guidance that has been necessary in the past. Google wants to do this, and I take them at their word.
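
[For readers wondering what “training computers to see patterns” could look like in practice, here is a minimal, self-contained sketch of the general idea: a toy nearest-neighbor classifier in Python that labels new data points by analogy to stored examples rather than by hand-written rules. The data and labels are invented for illustration; this shows the generic technique, not anything Google has disclosed.]

```python
# A toy pattern learner: 1-nearest-neighbor classification in pure Python.
# Instead of coding explicit rules, we store labeled examples and let each
# new point take the label of whichever example it most resembles.
import math

# Hypothetical training examples: (feature vector, label).
examples = [
    ((1.0, 2.0), "cold"),
    ((1.5, 1.8), "cold"),
    ((8.0, 9.0), "hot"),
    ((9.5, 8.5), "hot"),
]

def classify(point):
    """Return the label of the training example closest to `point`."""
    nearest = min(examples, key=lambda ex: math.dist(ex[0], point))
    return nearest[1]

print(classify((2.0, 2.0)))  # -> cold
print(classify((7.0, 8.0)))  # -> hot
```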

Read the unedited interview transcript here.

Wade Roush is a contributing editor at Xconomy.
