Three big things we still don’t know about AI’s energy burden
Summary
Researchers chased a simple but crucial number: how much energy leading AI models use to generate a single response. After months of opacity, companies released per-query estimates this summer (OpenAI: 0.34 watt‑hours; Google: about 0.24 watt‑hours), but those figures are incomplete and cover only chat interactions.
The article highlights three major unknowns: the published numbers are vague and cover only chat; claims that AI improves overall energy efficiency remain unproven; and, biggest of all, whether AI usage will grow to the scale data‑centre buildouts assume, or instead plateau or crash and leave stranded infrastructure.
Source
Key Points
- Per‑query energy estimates were published by Big Tech this summer (OpenAI ~0.34 Wh; Google ~0.24 Wh) but lack technical detail and transparency.
- Published figures are largely chat‑only and don’t cover more energy‑intensive modalities like image or video generation.
- Company disclosures omit critical details: which model was measured, the measurement methodology, and how energy use varies for long or reasoning‑intensive responses.
- Claims that AI will pay back its carbon cost by enabling efficiencies are mostly anecdotal and not yet proven at scale.
- Data‑centre expansions continue even as evidence is limited that AI use will grow to match that capacity.
- A key risk is demand: if usage reaches projected levels, AI could dramatically raise electricity consumption; if it doesn’t, recent investments may be wasted.
- Researchers and analysts call for far greater transparency from companies, given the unusual pace of data‑centre growth and potential systemic effects on grids and emissions.
Why should I read this?
Because it cuts straight to the hole in the story: yes, we finally have per‑query numbers, but they don't tell half the story. If you care about climate policy, energy infrastructure or where Big Tech is placing huge bets, this explains the three things we still don't know, and why those unknowns matter. Short version: the per‑chat energy cost looks tiny, but the scale, modalities and corporate opacity change the story fast.
Context and relevance
The piece sits at the intersection of AI hype, corporate sustainability claims and grid planning. Forecasts suggest AI could consume electricity equivalent to a large fraction of US households by 2028 if usage scales as some models assume. Policymakers, utilities and climate analysts need better data to judge whether today’s data‑centre buildout is a durable shift or a boom that could leave stranded assets and higher emissions.
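To see why the scale question dominates, the per‑query figures can be run through a back‑of‑envelope calculation. The sketch below uses OpenAI's published 0.34 Wh estimate; the daily query volume and the US household average (roughly 10,500 kWh/year per EIA figures) are illustrative assumptions, not numbers from the article.

```python
# Back-of-envelope: aggregate energy from per-query chat estimates.
# The query volume below is a hypothetical assumption for illustration.

WH_PER_QUERY = 0.34                  # OpenAI's published chat-only estimate
QUERIES_PER_DAY = 2.5e9              # assumed daily volume (hypothetical)
US_HOUSEHOLD_KWH_PER_YEAR = 10_500   # approximate US average annual use

annual_kwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1000
households = annual_kwh / US_HOUSEHOLD_KWH_PER_YEAR

print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Equivalent US households: {households:,.0f}")
```

Under these assumptions, even billions of chat queries a day amount to only tens of thousands of households' worth of electricity, which is far below household-scale projections for data centres. That gap is precisely the article's point: either the chat-only figures omit most of the real load (training, image and video generation, reasoning-heavy responses), or the buildout assumes demand that may never arrive.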