国家预印本平台 (National Preprint Platform)

To Theoretically Understand Transformer-Based In-Context Learning for Optimizing CSMA

Source: arXiv

Abstract

The binary exponential backoff scheme is widely used in WiFi 7, yet it still suffers poor throughput under dynamic channel environments. Recent model-based approaches (e.g., non-persistent and $p$-persistent CSMA) optimize backoff strategies only under a known and fixed node density, and thus still incur a large throughput loss when the node density is estimated inaccurately. This paper is the first to develop a theory of LLM transformer-based in-context learning (ICL) for optimizing channel access. We design a transformer-based ICL optimizer that pre-collects collision-threshold data examples together with a query collision case; these are assembled into a prompt that the transformer takes as input to learn the underlying pattern and generate a predicted contention window threshold (CWT). To train the transformer for effective ICL, we develop an efficient algorithm that guarantees a near-optimal CWT prediction within a limited number of training steps. Since perfect data examples may be hard to gather for ICL in practice, we further extend the framework to allow erroneous data input in the prompt, and we prove that our optimizer maintains minimal prediction and throughput deviations from the optimal values. Experimental results on NS-3 further demonstrate our approach's fast convergence and near-optimal throughput compared with existing model-based and DRL-based approaches under unknown node densities.
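The prompt construction described above can be illustrated with a minimal sketch. This is not the paper's actual transformer: the function names, the (collision rate, CWT) example values, and the single softmax-attention step standing in for the trained model are all assumptions for illustration. It only shows the ICL pattern of stacking labeled examples with an unlabeled query and letting attention over the examples produce a CWT prediction.

```python
import math

def build_prompt(examples, query_collision):
    """Assemble an ICL-style prompt: labeled (collision_rate, cwt)
    example pairs followed by the query, whose CWT slot is unknown."""
    return list(examples) + [(query_collision, None)]

def icl_predict_cwt(prompt, temperature=0.1):
    """One softmax-attention step as a stand-in for the transformer:
    the query attends to example collision rates and returns an
    attention-weighted average of the example CWTs."""
    *examples, (q, _) = prompt
    scores = [math.exp(-abs(q - c) / temperature) for c, _ in examples]
    z = sum(scores)
    return sum(s * w for s, (_, w) in zip(scores, examples)) / z

# Hypothetical pre-collected collision-threshold examples.
examples = [(0.05, 16), (0.20, 64), (0.40, 256), (0.60, 512)]
prompt = build_prompt(examples, query_collision=0.22)
cwt = icl_predict_cwt(prompt)
```

Under this toy attention rule, a query collision rate near an example's rate pulls the prediction toward that example's CWT, so higher observed collision yields a larger predicted contention window.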

Shugang Hao, Hongbo Li, Lingjie Duan

Subjects: Communications; Wireless Communications

Shugang Hao, Hongbo Li, Lingjie Duan. To Theoretically Understand Transformer-Based In-Context Learning for Optimizing CSMA [EB/OL]. (2025-08-19) [2025-08-24]. https://arxiv.org/abs/2508.09146.
