EdgePrompt: A Distributed Key-Value Inference Framework for LLMs in 6G Networks
As sixth-generation (6G) networks advance, large language models (LLMs) are increasingly integrated into 6G infrastructure to enhance network management and intelligence. However, traditional LLM architectures struggle to meet the stringent latency and security requirements of 6G, especially as increasing sequence lengths lead to greater task complexity. This paper proposes EdgePrompt, a cloud-edge collaborative framework based on a hierarchical attention splicing mechanism. EdgePrompt employs distributed key-value (KV) pair optimization techniques to accelerate inference and adapt to network conditions. Additionally, to reduce the risk of data leakage, EdgePrompt incorporates a privacy-preserving strategy that isolates sensitive information during processing. Experiments on public datasets show that EdgePrompt effectively improves inference throughput and reduces latency, providing a reliable solution for LLM deployment in 6G environments.
Jiahong Ning, Pengyan Zhu, Ce Zheng, Gary Lee, Sumei Sun, Tingting Yang
Communications; Wireless Communications
Jiahong Ning, Pengyan Zhu, Ce Zheng, Gary Lee, Sumei Sun, Tingting Yang. EdgePrompt: A Distributed Key-Value Inference Framework for LLMs in 6G Networks [EB/OL]. (2025-04-15) [2025-06-07]. https://arxiv.org/abs/2504.11729.