How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference

Source: arXiv
Abstract

This paper introduces a novel infrastructure-aware benchmarking framework for quantifying the environmental footprint of LLM inference across 30 state-of-the-art models as deployed in commercial data centers. Our framework combines public API performance data with region-specific environmental multipliers and statistical inference of hardware configurations. We additionally utilize cross-efficiency Data Envelopment Analysis (DEA) to rank models by performance relative to environmental cost. Our results show that o3 and DeepSeek-R1 emerge as the most energy-intensive models, consuming over 33 Wh per long prompt, more than 70 times the consumption of GPT-4.1 nano, and that Claude-3.7 Sonnet ranks highest in eco-efficiency. While a single short GPT-4o query consumes 0.43 Wh, scaling this to 700 million queries/day results in substantial annual environmental impacts. These include electricity use comparable to 35,000 U.S. homes, freshwater evaporation matching the annual drinking needs of 1.2 million people, and carbon emissions requiring a Chicago-sized forest to offset. These findings illustrate a growing paradox: Although AI is becoming cheaper and faster, its global adoption drives disproportionate resource consumption. Our study provides a standardized, empirically grounded methodology for benchmarking the sustainability of LLM deployments, laying a foundation for future environmental accountability in AI development and sustainability standards.
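The headline scaling claim (0.43 Wh per short GPT-4o query times 700 million queries/day) can be illustrated with a short back-of-envelope calculation. The Python sketch below is not the paper's code: the per-home annual electricity figure and the infrastructure multiplier are illustrative assumptions, and the naive product deliberately omits the region-specific environmental multipliers and data-center overheads the paper's framework applies, so it is not expected to reproduce the reported 35,000-home equivalence.

```python
# Back-of-envelope scaling sketch (not the paper's full accounting).
# The per-query energy and daily query volume come from the abstract;
# everything else is an illustrative assumption.

WH_PER_SHORT_QUERY = 0.43        # Wh per short GPT-4o query (from the abstract)
QUERIES_PER_DAY = 700_000_000    # assumed daily query volume (from the abstract)
DAYS_PER_YEAR = 365

# Hypothetical overhead multiplier (cooling, PUE, other infrastructure effects);
# the paper derives region-specific values instead of a single constant.
INFRA_MULTIPLIER = 1.0           # set > 1.0 to fold in overheads

# Annual inference energy in kWh at the chosen multiplier.
annual_kwh = (WH_PER_SHORT_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR
              * INFRA_MULTIPLIER) / 1000.0

# Illustrative comparison point: roughly 10,500 kWh/year per average U.S.
# household (an assumption for this sketch, not a value taken from the paper).
KWH_PER_US_HOME_YEAR = 10_500
equivalent_homes = annual_kwh / KWH_PER_US_HOME_YEAR

print(f"Annual inference energy: {annual_kwh / 1e6:.1f} GWh")
print(f"Equivalent U.S. homes at this raw figure: {equivalent_homes:,.0f}")
```

With the multiplier left at 1.0 this yields roughly 110 GWh/year; the gap to the paper's 35,000-home figure is where the infrastructure-aware multipliers and hardware-level accounting described in the abstract come in.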

Nidhal Jegham, Marwen Abdelatti, Lassad Elmoubarki, Abdeltawab Hendawi

Subjects: Energy and Power; Industrial Economics; Environmental Science and Technology; Environmental Quality Management

Nidhal Jegham, Marwen Abdelatti, Lassad Elmoubarki, Abdeltawab Hendawi. How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference [EB/OL]. (2025-05-14) [2025-06-08]. https://arxiv.org/abs/2505.09598.
