Brain’s ‘gears’ key to energy-efficient AI

Work is under way to make data centre-reliant tech more energy efficient. | Photo: IR Stone (iStock)

The human brain could hold the key to minimising the climate impact of energy-hungry technology like ChatGPT.

University of Sydney researchers are developing an AI method to reduce the energy required by data centres, which could help reduce the carbon footprint of large language models like ChatGPT.

Associate Professor Chang Xu said large language models were expected to increase global energy consumption.

But Associate Professor Xu said there may be a way to create energy-efficient computing that works like the most complex computer of all: the human brain.

He said while industries were making inroads in driving down emissions and energy use, advanced large language models like ChatGPT could require as much electricity as 17,000 households.

“Future generations under development could consume even more,” he said.

Associate Professor Xu said according to the US Office of Energy Efficiency and Renewable Energy, data centres accounted for two percent of the United States’ total energy use.

He said in Australia, reports suggested data centres accounted for one percent of total energy use, potentially reaching eight percent by 2030.

“Large language models, like OpenAI’s, require large amounts of computational power to sift through vast troves of data.

“We’re meant to be scaling back our energy use, but the advent of large language models has been a shot in the arm and we’re seeing energy usage of computing soar.”

Associate Professor Xu said most of the time when people used large language models, like ChatGPT, they were making small queries or asking for help on relatively simple tasks.

“Yet these models still fire on all cylinders to develop a response, using increasing amounts of energy,” he said.

He said we needed to think about the human brain to understand how his technique worked.

“When you think about a healthy human brain – it doesn’t fire all neurons or use all of its brain power at once. It operates with incredible energy efficiency, on just 20 watts of power, despite having around 100 billion neurons, which it selectively uses from different hemispheres of the brain to perform different tasks or types of thinking.

“In contrast, advanced AI programs like ChatGPT, which contains 175 billion parameters, require a staggering nine megawatts, equivalent to a medium-sized power station. This reminds us of the need to push the limits of machine intelligence, focusing not only on its accuracy but also on its efficiency.”

Associate Professor Xu said his team was developing algorithms that do just that.

He said they planned to bypass redundant computations and ensure the program didn’t automatically go into high gear, meaning far less energy would be required.
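The article does not describe the team's actual algorithm, but one well-known way to skip redundant computation is "early exit": a cheap confidence check after each layer, so easy inputs stop well before the full network has run. The sketch below, with hypothetical random weights and a made-up `early_exit_forward` function, illustrates only the control flow, not the Sydney team's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "deep" model: a stack of layers plus a cheap exit head after each one.
# All weights are random placeholders for illustration only.
LAYERS = [rng.standard_normal((16, 16)) * 0.1 for _ in range(8)]
EXIT_HEADS = [rng.standard_normal((16, 4)) * 0.1 for _ in range(8)]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def early_exit_forward(x, threshold=0.9):
    """Run layers one at a time; stop as soon as an exit head is confident.

    Returns (class probabilities, number of layers actually used). Easy
    inputs exit early, so the remaining layers -- the redundant
    computation -- are never executed at all.
    """
    h = x
    for i, (w, head) in enumerate(zip(LAYERS, EXIT_HEADS)):
        h = np.tanh(h @ w)            # one expensive layer
        probs = softmax(h @ head)     # cheap confidence check
        if probs.max() >= threshold:  # confident enough: skip the rest
            return probs, i + 1
    return probs, len(LAYERS)         # fell through: full-depth compute

x = rng.standard_normal(16)
probs, used = early_exit_forward(x, threshold=0.3)
print(f"exited after {used} of {len(LAYERS)} layers")
```

Lowering the threshold trades accuracy for energy: more inputs take the cheap path, and the per-query compute drops without retraining the full model.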