National Preprint Platform

Green-LLM: Optimal Workload Allocation for Environmentally-Aware Distributed Inference

Source: arXiv

Abstract

This letter investigates the optimal allocation of large language model (LLM) inference workloads across heterogeneous edge data centers (DCs) over time. Each DC features on-site renewable generation and faces dynamic electricity prices and spatiotemporal variability in renewable availability. The central question is: how can inference workloads be optimally distributed to the DCs to minimize energy consumption, carbon emissions, and water usage while enhancing user experience? This letter proposes a novel optimization model for LLM service providers to reduce operational costs and environmental impacts. Numerical results validate the efficacy of the proposed approach.
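The allocation problem described above can be sketched as a small linear program: route each hour's inference demand across data centers so that the combined electricity, carbon, and water cost is minimized, subject to per-DC capacity. This is only an illustrative sketch under assumed numbers; the DC count, prices, carbon/water intensities, and energy-per-request figures below are invented for demonstration and are not the paper's actual model, which also handles on-site renewables and user-experience terms.

```python
# Illustrative sketch (NOT the paper's model): allocate hourly LLM inference
# workloads across edge data centers via a linear program. All parameter
# values below are made-up assumptions for demonstration.
import numpy as np
from scipy.optimize import linprog

n_dc, n_t = 3, 4                                       # 3 DCs, 4 hours
demand = np.array([100., 120., 90., 110.])             # requests per hour
cap = np.array([60., 80., 70.])                        # per-DC hourly capacity

kwh_per_req = 0.002                                    # assumed kWh per request
price = np.array([[0.10, 0.12, 0.15, 0.11],            # $/kWh grid price,
                  [0.09, 0.14, 0.13, 0.10],            # per DC and hour
                  [0.12, 0.11, 0.10, 0.13]])
co2 = np.array([[0.4], [0.3], [0.5]]) * np.ones((1, n_t))   # kg CO2 / kWh
h2o = np.array([[1.8], [1.2], [2.0]]) * np.ones((1, n_t))   # L water / kWh
carbon_price, water_price = 0.05, 0.001                # $/kg, $/L penalties

# Effective $ cost of routing one request to DC d in hour t
# (a real model would first net out on-site renewable generation)
c = ((price + carbon_price * co2 + water_price * h2o) * kwh_per_req).ravel()

# Equality constraints: each hour's demand is fully served across DCs.
# Decision variable x is flattened row-major: index d * n_t + t.
A_eq = np.zeros((n_t, n_dc * n_t))
for t in range(n_t):
    A_eq[t, t::n_t] = 1.0
b_eq = demand

bounds = [(0.0, cap[d]) for d in range(n_dc) for _ in range(n_t)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x = res.x.reshape(n_dc, n_t)                           # rows = DCs, cols = hours
print("total cost: $%.4f" % res.fun)
print(np.round(x, 1))
```

The same structure extends naturally to the letter's setting by adding renewable-availability terms to the per-DC cost and latency-aware constraints for user experience.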

Jiaming Cheng, Duong Tung Nguyen

Subjects: Environmental Science and Technology; Environmental Management

Jiaming Cheng, Duong Tung Nguyen. Green-LLM: Optimal Workload Allocation for Environmentally-Aware Distributed Inference [EB/OL]. (2025-07-14) [2025-07-22]. https://arxiv.org/abs/2507.09942.
