Calit2's Larry Smarr on the Origins of the Internet, Innovations in IT, and Insights on the Path Ahead (Part I)
After establishing himself as a leading expert on computing and information technology, including the Internet and World Wide Web, Larry Smarr left the comfort of a job that was tailored for him at the University of Illinois at Urbana-Champaign for a position in San Diego that offered an even better fit. In 2000, Smarr was hired as a professor of computer science and engineering at UC San Diego—and almost exactly nine years ago, he was named the founding director of the California Institute for Telecommunications and Information Technology.
Today, Smarr (who also is a San Diego Xconomist) describes the institute, pronounced "Cal-IT-squared" (and known in official shorthand as Calit2), as a framework for collaboration among students and researchers throughout the University of California system, as well as industry. In the first half of this two-part article, Smarr and I discussed his role in the development of the Internet and the factors that help encourage technology innovation. Because the institute emphasizes and encourages cross-disciplinary research, Smarr refers to Calit2's affiliated centers as "loci for innovation"; the institute has drawn more than $400 million in federal funding since 2000.
As a researcher, Smarr also has continued working to develop technologies that combine optical networks, supercomputing, and grid technologies for use in the next-generation Internet. He views Calit2 as a “time machine” because the advanced capabilities of the institute’s high-performance network enable researchers to develop software applications five or 10 years before they can be deployed commercially. That’s something that could be true of any academic research center, although innovations developed for the Internet may stand in a class by themselves. (Smarr talks about the path forward and four big ideas in Part 2 of my story here.)
Smarr, who turns 61 on Oct. 16, traces the origins of the Internet to the early 1970s and the technical work done by Vint Cerf, Leonard Kleinrock, Robert Kahn, and others on the ARPAnet, the computer network developed by the Pentagon’s Advanced Research Projects Agency. By the mid-1980s, though, Smarr’s own career became intertwined with the development of the Internet. He was named in 1985 as founding director of the National Center for Supercomputing Applications, or NCSA, at the University of Illinois at Urbana-Champaign. During his tenure over the next 15 years, NCSA researchers, including Marc Andreessen, developed the first graphical Web browser (Mosaic) and technology that formed the basis of the popular Web server now known as Apache.
Because of the work done at the NCSA, which later became part of the National Computational Science Alliance, Smarr says he came to view university research as a crucial source for technology innovation—if not the source.
One anecdote he tells involves a presentation he gave in early 1994 to 50 chief information officers of major U.S. corporations like Wells Fargo and Mastercard that included one of the first live demonstrations of Web-browsing. Because there were only a few hundred websites in the world at that time, Smarr says his Web tour included the dinosaur museum of Honolulu Community College and a coffee pot at Cambridge University in the UK. A camera connected to the Internet took a picture of the coffee pot every few minutes so the university’s Web users could see how much coffee remained.
Smarr says the corporate technology gurus told him afterward: “You crazy academics. We can guarantee you that there is no business use for this World Wide Web of yours.” He adds, “Now that is a pretty interesting statement, because we tend to think that innovation comes from the private sector. But in fact, innovation typically comes out of the universities.” He contends that technology innovation is transferred from universities into startups, where it is distilled for adoption by larger companies into products.
Smarr says he saw three key modes of innovative technology transfer from the University of Illinois during those years: "One was that the people left and formed a startup called Netscape. The second was that Microsoft licensed [the Mosaic browser] and formed Internet Explorer. Just look at the 'about' box on your Web browser," Smarr says. The third method of technology transfer came through open source development. "Apache took our NCSA Web server and created the Apache server, which is the most widely used Web server today." He estimates that a trillion dollars' worth of technology innovation came out of the university during the 1990s.
“At the end of the day,” Smarr says, “I didn’t invent anything to do with the World Wide Web, but I did create an environment in which it could flourish. The most heavily hit website in the world in 1994 was NCSA in East Central Illinois. We invented the parallel Web server, which has now grown to scales of Google and everything else.”
In the beginning, however, Smarr says he was mostly interested in theoretical physics, astronomy, and cosmology. He got his Ph.D. in astrophysics from the University of Texas at Austin, developing the mathematical techniques needed to conduct computational research—using supercomputers to analyze the dynamics of such phenomena as colliding black holes. But he notes ruefully, "When I got my Ph.D. in '75, my Ph.D. advisor said, 'OK, now you need to get a top secret clearance to do nuclear weapons research.' I said, 'But my research is in open science.' And he said, 'The only place where we have supercomputers are the national nuclear weapon design laboratories like Livermore and Los Alamos.'"
Instead, Smarr says he left in the late 1970s for Germany’s Max Planck Institute for Physics, where he had the opportunity to conduct “open” research in relativistic astrophysics, using the first Cray supercomputer in Europe.
After he returned to the U.S. in the early 1980s, Smarr says he wrote an unsolicited proposal to the National Science Foundation for a $55 million grant to establish a national supercomputer center. He planned to conduct open scientific research and wanted to make a U.S. supercomputer available to scientists and universities throughout the country. The NSF decided instead to hold a nationwide competition to establish five supercomputer centers, and Smarr learned his proposal would be considered as part of that competition, along with a similar proposal that had been submitted for a supercomputer center in San Diego. The NSF funded both proposals in 1985, which is how the San Diego Supercomputer Center was established at UCSD at the same time as the NCSA in Illinois.
At about the same time, Smarr says, “we got into saying, OK, well, don’t we need to connect these supercomputers?”
For Smarr, this became one of the key events in the development of the Internet. In the mid-1980s, he says, most computational scientists, especially the physicists with clout at the NSF, used machines built on the instruction set architecture developed by Digital Equipment Corp., known as the DEC VAX. "The natural thing would have been to connect [the supercomputers] with the commercial, proprietary DEC-net," Smarr says.
But Smarr says he argued "very strongly" for the NSF to instead adopt an open standard for its network communications protocols, known as the Transmission Control Protocol and Internet Protocol, or TCP/IP. "I was only one of the people that was arguing for that," Smarr says. "But I was pretty persistent."
In fact, Smarr says, “We convinced the NSF to make a ruling that NSF money would only be used in grant requests for networking if they used TCP/IP. Now you can call this a gross intervention of the federal government in the private sector, picking winners and losers. But without it, we would not have the Internet. So this was a great lesson to me, that there are moments when the government needs to call the question, say this is the national policy, and then allow the market to adapt to it.”
Considering all the current concerns about lax cybersecurity and the vulnerability of existing Internet protocols, I asked Smarr whether he would do anything differently if he had the chance to do it over.
"That's a research question," he replies. "The NSF has a major program, called GENI [for Global Environment for Network Innovations], which is underway and that is to re-invent the Internet for the 21st century… We'll see what comes out of that effort."

"On the other hand," Smarr adds, "I don't know if there's a single engineering construct created by humankind that has grown as many orders of magnitude as the Internet has—and still functions fairly well as an engineering structure."
I also asked Smarr if he’s been personally involved in any startups.
Smarr says he was on the board of directors at San Diego-based Entropia, a PC grid computing startup that was founded in 1997 and stalled a few years ago—a victim of what he calls “the downdraft of 2000-2001.” The founders included Andrew Chien, a colleague who was Entropia’s chief technology officer, and who is now vice president of Intel Labs in Seattle and director of Future Technologies Research for Intel Corporation.
But in general, Larry Smarr doesn’t do startups.
“I guess it’s because of two things,” he says. “One, you have to decide where you can add the most value to the end customer and what role to play. And I feel that I’m a scientist. I’m a researcher. So I think that keeping innovation alive and nourished in the university is the way I can add the most value to society.
“The second thing,” Smarr says, “is I believe that making a successful startup is one of the most difficult human endeavors there is. It’s sort of like becoming the Rolling Stones—starting as a garage band, and it’s got about the same attrition rate, probably 1,000 to one, from business plan to a cash-positive successful startup. So I have nothing but the greatest admiration for the people who set out on that quest.”
Next: Larry Smarr on the path ahead for the Internet.