
New chip uses AI to shrink large language models' energy footprint by 50%

Date:
May 8, 2025
Source:
Oregon State University
Summary:
Researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.
FULL STORY

Oregon State University College of Engineering researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.

"We have designed and fabricated a new chip that consumes half the energy compared to traditional designs," said doctoral student Ramin Javadi, who along with Tejasvi Anand, associate professor of electrical engineering, presented the technology at the recent IEEE Custom Integrated Circuits Conference in Boston.

"The problem is that the energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing," said Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU. "That's what is causing data centers to use so much power."

The new chip itself is based on AI principles that reduce electricity use for signal processing, Javadi said.

"Large language models need to send and receive tremendous amounts of data over wireline, copper-based communication links in data centers, and that requires significant energy," he said. "One solution is to develop more efficient wireline communication chips."

When data is sent at high speeds, Javadi explained, it gets corrupted at the receiver and has to be cleaned up. Most conventional wireline communication systems use an equalizer to perform this task, and equalizers are comparatively power hungry.

"We are using those AI principles on-chip to recover the data in a smarter and more efficient way by training the on-chip classifier to recognize and correct the errors," Javadi said.

The Defense Advanced Research Projects Agency, the Semiconductor Research Corporation and the Center for Ubiquitous Connectivity supported the project, which earned Javadi the Best Student Paper Award at the conference.

Javadi and Anand are working on the next iteration of the chip, which they expect to bring further gains in energy efficiency.


Story Source:

Materials provided by Oregon State University. Original written by Steve Lundeberg. Note: Content may be edited for style and length.


Cite This Page:

Oregon State University. "New chip uses AI to shrink large language models' energy footprint by 50%." ScienceDaily. ScienceDaily, 8 May 2025. <www.sciencedaily.com/releases/2025/05/250508113141.htm>.
Oregon State University. (2025, May 8). New chip uses AI to shrink large language models' energy footprint by 50%. ScienceDaily. Retrieved May 9, 2025 from www.sciencedaily.com/releases/2025/05/250508113141.htm
Oregon State University. "New chip uses AI to shrink large language models' energy footprint by 50%." ScienceDaily. www.sciencedaily.com/releases/2025/05/250508113141.htm (accessed May 9, 2025).
