Research done by computer science professor Joel Sommers more than a decade ago has had a lasting impact on how networks consume power.
Twelve years ago, Joel Sommers was looking at the increasing power usage of all of our many connected devices and getting worried. “It was when consumer devices were really starting to explode,” says Sommers, professor of computer science at Colgate. “The iPod had already been out for a few years, and the iPhone was coming along.” Given the increase in power these new devices would need, Sommers became particularly concerned with the environmental impact of all the emissions they would generate.
“I had this realization that all of the technologies and their networks could potentially have a big impact on climate change,” he says. “No one was considering the secondary effects of all of this consumption.” Along with several colleagues at the University of Wisconsin-Madison, where he was then a doctoral student, Sommers began examining just how much power these devices and the networks connecting them were using, and whether there was a better way to connect them that would reduce the energy load.
The paper he and his co-authors published in 2008, “Power Awareness in Network Design and Routing,” proved incredibly influential in computer networking circles. In honor of its impact, the paper was just awarded a 2019 IEEE INFOCOM Test of Time Paper Award, presented by the Institute of Electrical and Electronics Engineers—the top professional organization in his field—for the “most cited and widely recognized” paper published 10 to 12 years earlier. “It’s a huge honor,” says Sommers, who joined the Colgate faculty in 2007. “In thinking about all of the other papers that came out in that three-year timespan, it’s pretty humbling to realize we were chosen for that.”
Until their paper, Sommers says, no one had ever done any systematic measurement of devices under different data load scenarios and physical configurations to understand their power consumption. The paper’s big revelation was that consumption was determined by the number of physical fiber-optic connections, not the amount of data traveling over each one. “We found that the amount of bandwidth going over a link doesn’t really impact power consumption measurably or significantly,” Sommers says.
In other words, a link running at 30 percent capacity and one running at 80 percent draw almost exactly the same amount of power. That’s important because networks often build extra capacity into their systems to keep up with surges in demand. “Network providers want their networks to run at a fairly low utilization,” says Sommers. “Even 50 percent utilization can be seen as too much. But since having one link on and available consumes basically a fixed amount of power, then running them at low utilization is really the worst-case scenario.”
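The consequence of a fixed per-link power draw can be made concrete with a little arithmetic. The sketch below uses assumed numbers (a 10 Gb/s link drawing 100 W), not figures from the paper, to show why low utilization is the worst case in energy-per-bit terms:

```python
# Back-of-the-envelope sketch: if a link's power draw is roughly fixed
# regardless of traffic, the energy cost *per bit carried* rises as
# utilization falls. The wattage and capacity below are illustrative
# assumptions, not measurements from the paper.

LINK_POWER_WATTS = 100.0    # assumed fixed draw for one active link
LINK_CAPACITY_GBPS = 10.0   # assumed link capacity

def joules_per_gigabit(utilization: float) -> float:
    """Energy spent per gigabit carried at a given utilization."""
    throughput_gbps = LINK_CAPACITY_GBPS * utilization
    return LINK_POWER_WATTS / throughput_gbps  # W / (Gb/s) = J/Gb

for u in (0.3, 0.8):
    print(f"{u:.0%} utilization: {joules_per_gigabit(u):.1f} J per gigabit")
```

Under these assumed numbers, the lightly loaded link spends roughly 33 J per gigabit versus about 12.5 J at 80 percent utilization, even though both links draw the same total power.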
On the other hand, running links at high utilization can degrade performance for customers. Given that tension, Sommers says, the paper has spurred provider networks to seek the right trade-off between power savings and excess capacity. “There have been a lot of papers that cited our work that came up with different optimization formulations,” says Sommers.
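One toy version of that trade-off: given a bundle of parallel links, keep on only as many as are needed to hold utilization below a performance cap, and power off the rest. The formulation, capacity, and wattage here are illustrative assumptions, not any specific model from the literature the article mentions:

```python
import math

# Toy sketch of the power/capacity trade-off: turn on the fewest
# parallel links that keep utilization at or below a cap, since each
# active link draws a roughly fixed amount of power. All numbers are
# assumptions for illustration.

LINK_CAPACITY_GBPS = 10.0
LINK_POWER_WATTS = 100.0

def links_needed(demand_gbps: float, max_utilization: float) -> int:
    """Fewest links that carry the demand at or below the utilization cap."""
    return math.ceil(demand_gbps / (LINK_CAPACITY_GBPS * max_utilization))

for demand in (12.0, 35.0):
    n = links_needed(demand, 0.8)
    print(f"{demand} Gb/s demand -> {n} active links, {n * LINK_POWER_WATTS:.0f} W")
```

Real formulations add routing, failure headroom, and time-varying demand, but the core idea is the same: unused links cost nearly as much as busy ones, so switch them off.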
That is especially true over the past decade with the rise of data centers, which have the potential to consume tremendous amounts of resources. “At certain times of day, the operator of a data center may not need all of the capacity running at full tilt, so there’s some opportunity to turn off some components there,” Sommers says. While private companies generally don’t share their strategies for saving energy, he speculates some have achieved a competitive advantage with better ways to shut off links to conserve power—and decrease costs.
In the past decade, Sommers has continued looking at novel ways to measure computer and network energy consumption. He has even worked those principles into a core course in scientific perspectives at Colgate to help educate first-year students on their own energy usage. “I had students do some measurement of power consumption on different electronic devices they were using, and to then take some of those measurements and extrapolate over the course of a week how much of their power footprint comes from their habits of how they use technology.”
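The extrapolation step in that classroom exercise is simple arithmetic: spot-measure a device’s draw in watts, estimate daily hours of use, and scale up to a week. The device names, wattages, and hours below are hypothetical stand-ins, not data from the course:

```python
# Illustrative sketch of the classroom exercise: extrapolate a weekly
# energy footprint from spot power measurements. Every device name,
# wattage, and usage figure here is a hypothetical assumption.

devices = {
    # name: (measured draw in watts, estimated hours of use per day)
    "laptop": (45.0, 6.0),
    "phone charger": (5.0, 2.0),
    "monitor": (30.0, 5.0),
}

def weekly_kwh(watts: float, hours_per_day: float) -> float:
    """Watts * hours/day * 7 days, converted to kilowatt-hours."""
    return watts * hours_per_day * 7 / 1000.0

total = 0.0
for name, (watts, hours) in devices.items():
    kwh = weekly_kwh(watts, hours)
    total += kwh
    print(f"{name}: {kwh:.2f} kWh/week")
print(f"total: {total:.2f} kWh/week")
```

With these assumed numbers the weekly total comes to about 3 kWh, the kind of concrete figure that lets a student connect daily tech habits to a power footprint.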
If anything, however, Sommers’s concern over the environmental impact of energy consumption has only increased as massive data centers have consolidated power usage. “There are really a handful of them across the planet, and they consume a lot of power,” he says. If there is one area of research now that he thinks could make a lasting impact on power consumption for the next decade, it’s figuring out how to break up those centers to distribute that capacity more equally around the world. “We need to come up with a way to have less monolithic data centers,” he says, “and distribute that functionality to the edges of the network so data resides closer to where it is being used — and closer to local power generation.”