CORVALLIS, Ore. – Researchers at Oregon State University and Baylor University have made a breakthrough toward reducing the energy consumption of the photonic chips used in data centers and supercomputers.
The findings are important because a data center can consume up to 50 times more energy per square foot of floor space than a typical office building, according to the U.S. Department of Energy.
A data center houses an organization’s information technology operations and equipment; it stores, processes and disseminates data and applications. Data centers account for roughly 2% of all electricity use in the United States, the DOE says.
According to the U.S. International Trade Commission, the number of data centers has risen rapidly as data demand has soared. The United States, home to many firms that produce and consume vast amounts of data, including Facebook, Amazon, Microsoft and Google, has more than 2,600 data centers.
The advance by John Conley of the OSU College of Engineering, former Oregon State colleague Alan Wang, now of Baylor, and OSU graduate students Wei-Che Hsu, Ben Kupp and Nabila Nujhat involves a new, ultra-energy-efficient method to compensate for temperature variations that degrade photonic chips. Such chips “will form the high-speed communication backbone of future data centers and supercomputers,” Conley said.
The circuitry in photonic chips uses photons – particles of light – rather than the electrons that course through conventional computer chips. Moving at the speed of light, photons enable the extremely rapid, energy-efficient transmission of data.
The drawback of photonic chips is that, until now, significant energy has been required to keep their temperature stable and their performance high. The team led by Wang, however, has shown that the energy needed for temperature control can be reduced by a factor of more than 1 million.
“Alan is an expert in photonic materials and devices and my area of expertise is atomic layer deposition and electronic devices,” Conley said. “We were able to make working prototypes that show temperature can be controlled via gate voltage, which means using virtually no electric current.”
Presently, Wang said, the photonics industry relies exclusively on components known as “thermal heaters” to fine-tune the working wavelengths of high-speed, electro-optic devices and optimize their performance. These thermal heaters consume several milliwatts of electricity per device.
“That might not sound like much considering that a typical LED lightbulb uses 6 to 10 watts,” Wang said. “However, multiply those several milliwatts by millions of devices and they add up quickly, so that approach faces challenges as systems scale up and become bigger and more powerful.”
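To put that scaling in perspective, here is a minimal back-of-the-envelope sketch in Python; the per-device figure and the device count are illustrative assumptions drawn from Wang’s “several milliwatts” and “millions of devices,” not measurements from the study.

# Illustrative estimate of the aggregate power drawn by thermal heaters.
# Both inputs are assumptions chosen for the sake of the example.
power_per_heater_w = 0.005       # assumed "several milliwatts" per device (5 mW)
device_count = 1_000_000         # assumed "millions of devices"

total_power_w = power_per_heater_w * device_count
print(f"Aggregate heater power: {total_power_w:,.0f} W (~{total_power_w / 1000:.0f} kW)")
# Prints roughly 5,000 W -- the equivalent of hundreds of the LED bulbs Wang mentions.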
“Our method is much more acceptable for the planet,” Conley added. “It will one day allow data centers to keep getting faster and more powerful while using less energy so that we can access ever more powerful applications driven by machine learning, such as ChatGPT, without feeling guilty.”
The research was supported by Intel, NASA and the National Science Foundation, and the findings were published in Scientific Reports, a Nature Portfolio journal.
Steve Lundeberg, 541-737-4039
[email protected]
John Conley, [email protected]