Large Language Models' Carbon Footprint Shows Accuracy-Sustainability Tradeoff

Updated June 19, 2025

Large language models, or LLMs, are increasingly prevalent, but their energy demands raise environmental concerns. A recent study indicates some LLMs generate up to 50 times more carbon emissions than others when processing queries. The research, published in Frontiers in Communication, suggests that higher accuracy in LLMs often correlates with greater energy consumption, creating a carbon footprint dilemma.

Researchers at Hochschule München University of Applied Sciences evaluated 14 LLMs, ranging from 7 billion to 72 billion parameters, using 1,000 benchmark questions. The study focused on how LLMs convert prompts into tokens, the numerical representations of words, and the energy used in subsequent computations. Reasoning models, which insert "thinking tokens" for internal processing, averaged 543.5 tokens per question, while concise models used only 37.7.
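The link between token count and emissions can be sketched with back-of-envelope arithmetic. The per-token energy figure and grid carbon intensity below are illustrative assumptions, not numbers from the study; only the average token counts (543.5 and 37.7) come from the article.

```python
# Rough sketch: per-query emissions scale with the number of tokens generated.
ENERGY_PER_TOKEN_WH = 0.002   # assumed Wh per generated token (hypothetical)
GRID_G_CO2_PER_KWH = 400      # assumed grid carbon intensity, g CO2-eq/kWh (hypothetical)

def grams_co2(tokens_per_answer: float, num_questions: int = 1000) -> float:
    """Estimate g CO2-eq for answering num_questions at a given avg token count."""
    kwh = tokens_per_answer * num_questions * ENERGY_PER_TOKEN_WH / 1000
    return kwh * GRID_G_CO2_PER_KWH

reasoning = grams_co2(543.5)  # avg tokens/question for reasoning models (study)
concise = grams_co2(37.7)     # avg tokens/question for concise models (study)
print(f"reasoning ~ {reasoning:.0f} g, concise ~ {concise:.0f} g, "
      f"ratio ~ {reasoning / concise:.1f}x")
```

Note that token count alone yields only about a 14x gap here; the study's reported 50x difference implies the per-token cost also varies between models, which this sketch deliberately holds constant.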

Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences, said the environmental impact depends heavily on the reasoning approach. He added that reasoning-enabled models produced up to 50 times more carbon dioxide emissions than concise response models.

The study also revealed a trade-off between accuracy and sustainability. The Cogito model, with 70 billion parameters, achieved 84.9% accuracy but produced three times more carbon dioxide emissions than similar models. Dauner noted that models with emissions below 500 grams of carbon dioxide equivalent struggled to exceed 80% accuracy.

Subject matter also plays a role. Complex questions, such as those involving abstract algebra or philosophy, resulted in up to six times higher emissions compared to straightforward topics.

What’s next

While acknowledging that emissions depend on local energy grids and specific models, the study's authors hope their findings will encourage users to be more selective in their LLM use. Dauner suggests that users can significantly reduce emissions by prompting AI for concise answers or reserving high-capacity models for tasks that genuinely require their power.