
Logit Reweighting for Topic-Focused Summarization

Source: arXiv
Abstract

Generating abstractive summaries that adhere to a specific topic remains a significant challenge for language models. While standard approaches, such as fine-tuning, are resource-intensive, simpler methods like prompt engineering often struggle to maintain topical focus, particularly with smaller models. To address this, we propose a lightweight method that enhances topical relevance by directly reweighting the logits of topic-relevant tokens during generation. We evaluate three such reweighting techniques: Constant Shift, which adds a constant value to logits; Factor Scaling, which multiplies them by a factor; and Threshold Selection, which selectively boosts logits that exceed a probability threshold. Experiments on the NEWTS topical summarization dataset, using both Gemma-2B and Llama-3-8B models, show that these techniques effectively increase the use of topic-relevant vocabulary. Notably, the Threshold Selection method successfully improves topical focus without compromising summary quality, a trade-off often seen in other approaches. Our findings demonstrate that directly reweighting logits is a practical and resource-efficient alternative to fine-tuning, offering a promising pathway for precisely controlling the thematic content of generated text.
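The abstract describes the three reweighting operations concretely enough to sketch them. Below is a minimal PyTorch illustration of how such reweighting could be applied to next-token logits; it is not the authors' implementation, and the function name, hyperparameter defaults, and the representation of topic-relevant tokens as an id tensor are assumptions made here for concreteness.

```python
import torch

def reweight_logits(logits: torch.Tensor,
                    topic_ids: torch.Tensor,
                    method: str = "threshold_selection",
                    shift: float = 2.0,
                    factor: float = 1.5,
                    prob_threshold: float = 1e-3) -> torch.Tensor:
    """Return a copy of `logits` with topic-relevant token logits boosted."""
    logits = logits.clone()
    if method == "constant_shift":
        # Constant Shift: add a fixed value to each topic token's logit.
        logits[topic_ids] += shift
    elif method == "factor_scaling":
        # Factor Scaling: multiply topic-token logits by a fixed factor.
        # (Note: for a negative logit, a factor > 1 pushes it further down.)
        logits[topic_ids] *= factor
    elif method == "threshold_selection":
        # Threshold Selection: boost only those topic tokens whose current
        # probability already exceeds the threshold, leaving unlikely
        # topic tokens untouched.
        probs = torch.softmax(logits, dim=-1)
        eligible = topic_ids[probs[topic_ids] > prob_threshold]
        logits[eligible] += shift
    else:
        raise ValueError(f"unknown method: {method}")
    return logits

# Toy usage: a 10-token vocabulary where tokens 2 and 7 are topic-relevant.
logits = torch.randn(10)
topic_ids = torch.tensor([2, 7])
boosted = reweight_logits(logits, topic_ids, method="threshold_selection")
```

In practice, a function like this would be applied to the model's next-token logits at every decoding step, for example via a logits-processing hook such as the `LogitsProcessor` interface in Hugging Face `transformers`.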

Joschka Braun, Bálint Mucsányi, Seyed Ali Bahrainian

Computing Technology, Computer Technology

Joschka Braun, Bálint Mucsányi, Seyed Ali Bahrainian. Logit Reweighting for Topic-Focused Summarization [EB/OL]. (2025-07-07) [2025-07-18]. https://arxiv.org/abs/2507.05235
