IT Matters: Nicholas Carr on Utility Computing, the Dangers of Internet Culture, and the Google Brain


create robust Web-based programs to handle very esoteric business processes, at least not quickly. So I think for the next 10, 20 years, larger business computing is going to be very much a hybrid. They are going to pull in a lot of the more basic computing services, whether it’s raw compute power, storage, or the basic enterprise systems for accounting and account management. Those will go online quickly. But it’s going to be years before they close up their data centers altogether.

X: One of the companies we cover all the time is VMware, which makes virtualization software. You mention how dysfunctional the distribution of computing power has become, because every company went out and built its own data center to meet peak demand and then uses only 20 percent of that capacity. Virtualization offers the possibility of reclaiming the other 80 percent.

NC: Not only did they build their data centers for peak demand, they built each application and each separate computer for peak demand.

X: Exactly. So the market for virtualization is still huge, and there will be this wave of companies trying to make most efficient use of their existing infrastructure. Then maybe they’ll figure out how to plug that into the cloud, so their virtualized data centers are part of the larger virtualized cloud.

NC: I think that is exactly how it will happen. We’re starting to see it already. Big corporations are realizing that virtualization can save them literally billions of dollars. I think we will see companies take the utility technologies that software-as-a-service firms are using and revamp their internal IT along those lines. Essentially, they will run their own utility. And over time, as that happens, the lines between the private ones and the public ones will start to blur as capacity is allowed to shift where it needs to go.

X: When a revolution sets in, often you think “This is going to change everything forever.” And that’s sometimes true and sometimes it’s not. The utility model in electricity is pretty much here to stay. And the PC revolution would have seemed pretty permanent in, say, 1995, but now, as you say in your chapter on the retirement of Bill Gates, the PC era is coming to an end. Do you see the utility computing era as something that is more likely to be here to stay, at least for the foreseeable future, or is there yet another revolution hiding inside that somewhere, waiting to emerge?

NC: I think that some model of utility computing, if we define it broadly as a shared infrastructure for computing, is going to be the way we do computing for the future. If you assume that all of this stuff is going to stay networked—and I don’t see how we can go back to not being networked—that implies you are going to supply the resources in as economically efficient a way as possible through that network.

One thing I’m happy to admit is I don’t know what the ultimate structure of the utility computing industry will be—whether it’s going to be four companies that run everything or a bunch of smaller companies that have hashed out really strong standards so that data can flow between them very easily. But it does seem to me that it will involve the centralization of a lot of computing functions that have been fragmented.

X: At the very end of the book you point toward the possibility of brain-machine interfaces, and how Sergey Brin and Larry Page at Google are obsessed with this idea of tapping directly into the human brain, and at the same time taking Google’s massive databases and somehow endowing them with artificial intelligence. Does that strike you as something that’s any more realistic now than it was in, say, 1968, when we first met HAL in 2001: A Space Odyssey?

NC: Yes, it does strike me as more realistic. But I don’t think what’s being built is a replica of human intelligence. We are at the stage now where there is so much data connected, and so many microprocessors, and such powerful microprocessors, that I think the nature of computer programming is going to change. It’s going to be much more along biological lines, where we train computers to see patterns. And as we move toward a more semantic Web, where information is coded in a much richer way that allows computers to make connections between information on their own, I think we will see computing systems becoming able in some rudimentary way to think and make decisions without the kind of human guidance that has been necessary in the past. Google wants to do this, and I take them at their word.

Read the unedited interview transcript here.


Wade Roush is the producer and host of the podcast Soonish and a contributing editor at Xconomy. Follow @soonishpodcast
