THE FUTURE AS SEEN THROUGH TECHNOLOGY
May 17, 2004
Setting speed records over the Internet isn't an easy task. Technically, light traveling over a fiber optic network moves, well, at the speed of light. So to conduct its Internet land speed contest, the Internet2 consortium had to find another metric to measure speed by, and in this case it's raw capacity over distance. Internet2 members from the California Institute of Technology and CERN took the prize, sending data 11,000 kilometers between Caltech's Los Angeles laboratories and CERN's campus in Geneva at a rate of 6.25 Gb/s. That's a high-water mark of 68,431 terabit-meters per second, packed into a single data stream connecting two servers on opposite ends of the world. And it was accomplished using the same IPv4 protocols that power the public Internet.
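The contest metric is simple arithmetic: throughput multiplied by distance. A back-of-the-envelope check of the headline number, using only the figures quoted above (the small mismatch just reflects the article's rounded 11,000 km distance):

```python
# Internet2 land speed record metric: throughput x distance.
throughput_bps = 6.25e9      # 6.25 Gb/s, as reported
distance_m = 11_000 * 1000   # ~11,000 km, as reported (rounded)

# Capacity-distance product in terabit-meters per second.
tbm_per_s = throughput_bps * distance_m / 1e12
print(round(tbm_per_s))      # ~68,750 with the rounded distance

# Working backward, the quoted 68,431 Tb-m/s implies a measured
# path of about 10,949 km, consistent with "11,000 kilometers."
implied_km = 68_431e12 / 6.25e9 / 1000
print(round(implied_km))     # ~10,949 km
```

In other words, the record is a bandwidth-distance product, not a latency figure, which is why a single 6.25 Gb/s stream over a transatlantic path can set it.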
That's a lot of bandwidth by anyone's standards - roughly 10,000 times faster than a DSL or cable modem connection. It raises the question: is that kind of capacity truly necessary? The people at Internet2 answer with an unequivocal "yes."
Internet2 - a nonprofit consortium of 300 universities, institutions and technology companies developing the Internet of the future - considers broadband to be in its infancy. The demand created by bandwidth-intensive applications will make the DSL and cable connections of today look like the dial-up modems of the last decade, said Steve Corbato, director of network infrastructure for Internet2.
In fact, the network run by Internet2, called Abilene, maintains one of the highest capacities in the world, connecting dozens of GigaPOPs around the country with an OC-192c 10 Gb/s IP backbone supplied by Qwest Communications. On a normal day, the member institutions of Internet2 keep Abilene 10% to 15% full, but researchers also use the network as a massive proving ground for the latest Internet technologies and experiments, including the development of IPv6.
"This is effectively the research network of U.S. higher education," Corvato said. "It may seem like a lot, but the type of Internet environment we have on campuses now is what experts believe we'll have at home when real broadband is deployed. And we don't consider 1 Mb/s real broadband."
Just as home broadband use graduates to academic levels, science and research will place even greater demands on the Internet's capabilities. A study by the U.S. Department of Energy found that the network requirements of scientists in high-energy physics, astrophysics, fusion energy, climatology, bioinformatics and other data-intensive fields will reach the terabit-per-second range within the next decade. CERN, in conjunction with Fermilab in the U.S., is now building the Large Hadron Collider, a particle accelerator with a 27-kilometer circumference, which, when finished in 2007, will be the most powerful scientific instrument in the world, generating petabyte after petabyte of data that will need to be transported to research institutions all over the world.
"It's not just physics and astrophysics that are very data intensive though," Corvato said. "Every natural science is becoming data intensive, driving the need for very distributed, very computational intense networks around the world."