Cold Space with Power: [2N+1] Opens Boutique Data Center in Somerville

7/30/08

When I pulled up to 35 McGrath Highway in Somerville, just a couple of doors down from Sav-Mor Liquors, all I found was a squat, brown, windowless concrete building and an unpaved parking lot. It was a hot day in mid-July, and I was searching for a new data center company with the geeky name [2N+1] (the brackets are part of the name). But there was very little about this old building, wedged between the McGrath overpass on one side and weed-lined railroad tracks on the other, that looked high-tech.

But in the computing world, I reminded myself, data centers are about as unsexy as it gets. They aren’t about software—the flashy Web-based games, social networking and media-sharing services, or business systems we often write about here. They aren’t even about the hardware that runs the software—the gleaming, blinking racks of blade servers from the likes of HP or EMC. Data centers simply provide space, electricity, cooling, and connectivity for that hardware. They are, in other words, the technology behind the technology behind the technology; as long as the power is on, nobody notices them. The less flash, the better.

Which is why I concluded that I must be in the right place. I stepped over some construction detritus, went in the unmarked door, and met two of the founders and principals at [2N+1], Vincent Bono and Will Locandro. They showed me into the conference room. As it turned out, this was one of the only finished spaces in the five-story building. But the flurry of remodeling would soon be over, Bono assured me. “We are 30 days from launch, and we have customers ready to move in on day 31, literally,” he said.

The construction didn’t faze me, since I was visiting [2N+1] to find out what it takes to put together a brand-new data center—or, to be more exact, a “colocation center” (also known as a “carrier hotel”) where multiple companies that have run out of room or power in their own facilities put their extra server, network, and storage gear. Though I’ve been writing about information technology for a decade, this is one corner of the business I’ve never really explored. I’ve been wanting to fill that gap recently—in part because data centers are now at the heart of so many companies’ IT infrastructures, and in part because the idea of “cloud computing,” or trusting computing jobs to far-away resources that are administered like utilities, is catching on so fast.

But there’s also another, more personal reason: our recent experience with hosting provider The Planet, which suffered a huge electrical explosion on May 31 at its main data center in Houston, where Xconomy’s own servers are located. We got through the crisis okay, but it left me wondering how the heck data centers work, anyway, and what can be done to make them more reliable.

Bono and Locandro seemed pretty qualified to educate me. Not only do both men have a long history working for local infrastructure providers—Bono as a network and data center designer for HarvardNet, Boston Datacenters, and fiber network operator Global NAPs; Locandro in sales and business development at Cabletron, Riverstone Networks, and Enterasys—but they’d just spent months retrofitting this hulking pre-computer-age building for up to 22,000 square feet of server equipment, largely on their own dime.

The most important thing about the former furniture factory, Bono and Locandro explained to me right away, isn’t the fact that it’s virtually fireproof, or that its steel-reinforced concrete construction can support bone-crushing loads of 400 pounds per square foot on the upper floors and 2,500 pounds per square foot on the lower floors, or that it has its own 15,000-gallon diesel fuel tank, big enough to keep backup generators running for four days. No, the most important thing about it is its location. The neighborhood may look inauspicious to passers-by, but it turns out that underneath those weedy railroad tracks runs a valuable resource—fiberoptic cable. “Here’s Boston and here’s Cambridge,” says Locandro, gesturing at a map. “Pretty much every ounce of fiber between the two cities runs down these tracks.” That enables [2N+1] to tap into fibers owned by every major network provider, making its facility carrier-neutral—which appeals to customers who may already have a contract with the likes of Verizon, Qwest, Lightower, Cogent, or Level3.
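For a sense of scale, the tank size and the four-day claim together imply an average fuel burn; the hourly figure below is my own back-of-envelope arithmetic from those two numbers, not anything the company quoted:

```python
# Rough check on the diesel tank claim (illustrative only; the burn rate
# is derived from the article's figures, not stated by [2N+1]).
tank_gallons = 15_000        # on-site diesel storage
runtime_days = 4             # claimed runtime on backup generators

runtime_hours = runtime_days * 24
implied_burn_rate = tank_gallons / runtime_hours   # gallons per hour

print(f"Implied average burn rate: {implied_burn_rate:.0f} gal/hr "
      f"over {runtime_hours} hours")
# -> roughly 156 gal/hr, averaged across however many generators are running
```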

Just across the tracks, moreover, is a facility owned by NSTAR, the Boston area’s largest electrical utility. “There is a lot more that goes into looking for a location than just finding a big, empty, concrete building,” says Bono. “Not only do you have to be proximate to fiberoptic resources for multiple carriers, but you need to go somewhere where the utility is friendly and can bring you power. Municipal utilities are famous for not wanting to do that, which made places like Cambridge, Shrewsbury, and Braintree a last resort for us. Somerville is a much more friendly city to do business in, in terms of utilities, permitting, and even help from the mayor’s office.”

[2N+1] is able to pull enough power from the grid to supply up to 250 watts per square foot to individual colocation customers. NSTAR could supply even more, but as every data center operator is acutely aware, each watt of power flowing into a server rack produces a certain amount of waste heat, and 250 watts per square foot is the maximum amount the building’s new water-cooled air conditioning system can handle. (On the first, third, and fourth floors, where the building’s rentable space is located, cold air will be blasted into the facility underneath the raised floors—the space acts like a vast cooling duct—and then up into the individual server racks.)
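To put that density in perspective, here is a quick sketch of the building’s maximum electrical and cooling load, assuming (hypothetically) that all 22,000 square feet of rentable space were built out and drawing the full 250 watts per square foot; real deployments would almost certainly come in lower:

```python
# Back-of-envelope power-and-cooling budget (illustrative figures only;
# assumes every rentable square foot draws the 250 W/sq ft maximum).
rentable_sqft = 22_000
power_density_w_per_sqft = 250

it_load_watts = rentable_sqft * power_density_w_per_sqft
it_load_mw = it_load_watts / 1_000_000

# Essentially every watt delivered to a server rack ends up as heat,
# so the cooling plant has to remove a matching thermal load.
cooling_load_tons = it_load_watts / 3_517   # 1 ton of refrigeration ≈ 3,517 W

print(f"Max IT load:  {it_load_mw:.1f} MW")
print(f"Cooling load: {cooling_load_tons:,.0f} tons of refrigeration")
# -> about 5.5 MW of IT load, on the order of 1,500 tons of cooling at full build-out
```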

Curious about the company’s exposure to the kind of electrical-room disaster that hobbled The Planet, I asked Bono whether all of the building’s power came through a central conduit, as it did at the Houston facility. “You are limited by what the fire department will allow; you have to have one place where they can send a person to shut down power to the building,” he answered gamely. “But we do have two feeds from NSTAR, from two conduit banks that are close to each other but separated by 30 inches of concrete. They come into a substation where, theoretically, an explosion could cause a problem, but it’s usually a transformer that blows, and those are outside, 30 feet apart, separated by a concrete wall.”

That sounds smart to me. And to guard against power losses of any origin, there’s the aforementioned diesel generator, as well as two floors full of refrigerator-sized, battery-powered UPSs, or uninterruptible power supplies, which store enough energy to keep customers’ equipment running for up to 15 minutes. (They’re really just needed to bridge the 10- to 60-second gap while the diesel generator spins up.) The company will provide two UPSs for every customer—a primary and a backup—plus a spare nearby that can be rolled in if one of the first two breaks. In fact, the name [2N+1] is a formula summarizing the company’s whole philosophy of redundancy: if you’ve got N hardware components running, keep another N spares on hand, plus one.
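Written out as simple arithmetic, the redundancy rule and the UPS bridging requirement look something like this; it’s a minimal sketch using the figures above, not anything the company published:

```python
# Minimal sketch of the [2N+1] redundancy rule and the UPS bridge requirement,
# using the numbers quoted in the article (illustrative only).

def units_needed(n_active: int) -> int:
    """2N+1: for N components carrying load, keep N backups plus one spare."""
    return 2 * n_active + 1

# Example: one active UPS per customer -> a backup plus a roll-in spare.
print(units_needed(1))   # -> 3

# The batteries only have to bridge the gap until the diesel generator is up.
ups_ride_through_s = 15 * 60   # up to 15 minutes of stored energy
generator_start_s = 60         # worst case of the quoted 10-60 second spin-up

margin = ups_ride_through_s / generator_start_s
print(f"UPS ride-through covers the generator start with {margin:.0f}x margin")
```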

As data centers go, Bono explained, 22,000 square feet is on the small side, at least compared to facilities outside Route 128 or in places like Dallas, Houston, and Washington State (where Google and Microsoft are both building enormous data centers to power their cloud-computing services). But for a facility inside Route 128—and especially for one on Boston’s doorstep—[2N+1] is “very big,” according to Bono.

Still, it’s something of a boutique operation, designed for a specific kind of customer—those needing between 250 and 5,000 square feet of colocation space, with a sweet spot around 1,000 square feet. “Those are the customers that are historically the least amount of trouble to work with,” Bono explains. “Very large companies want you to offer all of these ancillary managed services that we don’t want to get into. And very small customers want you to essentially be their IT team, and we don’t want to do that, either.”

[2N+1]’s perfect customer, says Locandro, would be “a mid-size technology company with its own IT staff who understand that they are going to be providing their own DNS and DHCP and storage and infrastructure, and they just need a foundation of cold space with power.” (If you don’t know what DNS and DHCP are, then, I would guess, you need not apply.)

“The majority of our customers,” says Locandro, “are walking in and saying, ‘Hi, I’m located at One International Place, and I can’t get any more air handlers into my data center, and I can’t put a diesel generator on the roof, and I can’t get any more power from the landlord, but I do have connectivity, so I want to run a fiber ring to your facility and move our whole infrastructure there.’ We become an extension of their real estate.” And between [2N+1]’s fiberoptic connectivity, its load-bearing floors, and its dual NSTAR conduits, it’s valuable real estate indeed—though it may not look it from the outside.

Wade Roush is a contributing editor at Xconomy.
