
AI’s Energy Appetite Is Bigger Than You Think

Sizing Up the Power Behind the Prompts

A UK government-commissioned report has tackled the daunting task of estimating the energy consumption of artificial intelligence systems—and while the verdict is far from definitive, one thing is clear: AI is consuming power at an accelerating pace. Conducted by innovation agency Nesta and environmental think tank Green Alliance, the research delves into the environmental cost of training and deploying large AI models, an issue experts say has remained opaque but vital. With estimates suggesting that training a single large model can consume as much electricity as 100 UK homes use in a year, the report sheds new light on the complex challenge of measuring and mitigating AI's environmental footprint.

Data Deserts, Energy Black Boxes

The report highlights the major obstacle to understanding AI's true energy footprint: a lack of reliable, public data from the companies building these systems. Most major AI developers—such as OpenAI, Google, and Meta—do not disclose detailed information on energy use, making precise comparisons difficult. Policymakers and researchers are calling for more transparency to help evaluate whether current trends are sustainable. Without standardized reporting or accountability frameworks, assessing the climate impact of generative AI and large language models remains an exercise in informed guesswork. Still, the report signals a growing push to address these gaps before AI's energy use spirals beyond control.

BytesWall

BytesWall brings you smart, byte-sized updates and deep industry insights on AI, automation, tech, and innovation — built for today's tech-driven world.