AI processing could consume ‘as much electricity as Ireland’

Should we be worried about the increasing power demands of AI chatbots? According to The Register, it might be time:

The recent spike of interest in AI, driven by large language models (LLMs) and generative AI, is pushing adoption of the tech across a wide variety of applications, raising worries that the processing this requires will cause a surge in datacenter electricity consumption.

These concerns are raised in a paper by Alex de Vries, a researcher at the Vrije Universiteit Amsterdam.

In the paper, de Vries notes that research into the sustainability of AI has focused on the training phase of AI models, because training is generally considered the most resource-intensive, and therefore the most energy-consuming, part of the process.

However, relatively little attention is paid to the inference phase, he argues, even though there are indications that inference – running the trained model to serve requests – could contribute significantly to an AI model’s life-cycle costs.
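The "as much electricity as Ireland" headline comes from a back-of-envelope estimate of inference at scale: take a hypothetical fleet of AI servers running around the clock and multiply out the power draw. The sketch below uses illustrative figures (server count and per-server power are assumptions, not exact numbers from the paper) just to show the shape of the arithmetic:

```python
# Back-of-envelope estimate of annual electricity use for a large AI
# inference fleet. All figures are illustrative assumptions.
servers = 500_000          # hypothetical number of AI inference servers
kw_per_server = 6.5        # assumed average power draw per server, in kW
hours_per_year = 24 * 365  # servers assumed to run continuously

# kW * hours = kWh; divide by 1e9 to convert kWh to TWh
twh_per_year = servers * kw_per_server * hours_per_year / 1e9
print(f"{twh_per_year:.1f} TWh/year")
```

With these assumed inputs the fleet lands in the high-twenties of TWh per year, which is the same order of magnitude as the annual electricity consumption of a small country – hence the comparison in the headline.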

Read the article here.
