With boom of generative AI, researcher warns of energy costs

UPI
An image is generated from the prompt 'realistic oil painting of data servers consuming incredible amounts of electricity' by DALL-E. Image courtesy of Adam Schrader

Oct. 10 (UPI) -- Generative artificial intelligence technologies such as OpenAI's ChatGPT chatbot and the image creator Midjourney have boomed in the last year. But with that increase in computing needs comes a steep rise in energy costs that could bottleneck the supply chain for computer servers.

The warning was put forth by Alex de Vries, a Dutch researcher at Vrije Universiteit Amsterdam's School of Business and Economics, in a new commentary published in the academic journal Joule.

De Vries, writing in the academic version of an op-ed, noted that the accelerated development of AI raises concerns about the electricity consumption and potential environmental impact of the technology and data centers.

Large language models such as OpenAI's GPT-3, the model from which ChatGPT was developed as a specialized variant, are widely assumed to consume the most energy during the training phase, when algorithms are fed large data sets.

Hugging Face reported that its BLOOM model consumed 433 megawatt-hours of electricity during training, enough to power 40 average American homes for a year. GPT-3, Gopher and OPT reportedly used 1,287 MWh, 1,066 MWh and 324 MWh, respectively, for training.

According to Mosharaf Chowdhury of the University of Michigan, 1,287 MWh is enough to supply an average U.S. household with electricity for 120 years.
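A quick back-of-the-envelope check shows how those household comparisons follow from the training figures above. The sketch below assumes an average U.S. household consumes roughly 10.7 MWh of electricity per year, a figure in line with U.S. Energy Information Administration averages; the training totals are the ones cited in the article.

```python
# Back-of-the-envelope check of the household-year comparisons above.
# Assumption: an average U.S. household uses ~10.7 MWh of electricity
# per year (roughly the EIA-reported average). Training totals are the
# figures cited in the article.

AVG_US_HOUSEHOLD_MWH_PER_YEAR = 10.7  # assumed annual consumption

training_mwh = {
    "BLOOM": 433,
    "GPT-3": 1287,
    "Gopher": 1066,
    "OPT": 324,
}

for model, mwh in training_mwh.items():
    household_years = mwh / AVG_US_HOUSEHOLD_MWH_PER_YEAR
    print(f"{model}: {mwh} MWh ~ {household_years:.0f} household-years")

# BLOOM: 433 MWh  ~ 40 household-years  (40 homes for one year)
# GPT-3: 1287 MWh ~ 120 household-years (one home for 120 years)
```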

However, de Vries challenged the idea that the models consume the most energy while training. Citing data from NVIDIA and Google, he said research supports that the inference phase of an AI model, when the trained model is put into production and generates outputs from new data, might contribute "significantly" to its life-cycle energy costs.

"Google's parent company, Alphabet, also expressed concern regarding the costs of inference compared to the costs of training," de Vries wrote. "However, contrasting data from Hugging Face indicates that the BLOOM model consumed significantly less energy during inference compared to the training phase."

For example, ChatGPT could consume 564 MWh of electricity a day to respond to 195 million requests.
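On a per-request basis, that estimate works out to only a few watt-hours per query. The arithmetic below uses only the article's figures (564 MWh per day, 195 million daily requests); the per-request and annualized values are derived, not reported.

```python
# Per-request and annualized energy for the ChatGPT estimate above.
# Inputs are the figures cited in the article; the derived values
# (Wh per request, GWh per year) are simple arithmetic, not reported data.

daily_mwh = 564            # estimated daily consumption, from the article
daily_requests = 195e6     # estimated daily requests, from the article

wh_per_request = daily_mwh * 1e6 / daily_requests  # MWh -> Wh
annual_gwh = daily_mwh * 365 / 1000                # MWh/day -> GWh/year

print(f"~{wh_per_request:.1f} Wh per request")   # ~2.9 Wh per request
print(f"~{annual_gwh:.0f} GWh per year")         # ~206 GWh per year
```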

Companies around the world are working to increase the efficiency of AI hardware and software, but de Vries suggested those gains could end up increasing demand for the technology, further driving up energy consumption, a phenomenon known as the Jevons paradox.

"The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it," de Vries said in a statement.

De Vries concluded that it is "too optimistic" to expect that improvements in hardware and software efficiency will fully offset any long-term growth in AI-related electricity consumption.

The researcher added it would be "advisable" for developers to critically consider whether AI is needed in the first place.

"Regulators might consider introducing specific environmental disclosure requirements to enhance transparency across the AI supply chain, fostering a better understanding of the environmental costs of this emerging technological trend," de Vries wrote.