
FastLongSpeech: Enhancing Large Speech-Language Models for Efficient Long-Speech Processing


Source: arXiv
Abstract

The rapid advancement of Large Language Models (LLMs) has spurred significant progress in Large Speech-Language Models (LSLMs), enhancing their capabilities in both speech understanding and generation. While existing LSLMs often concentrate on augmenting speech generation or tackling a diverse array of short-speech tasks, the efficient processing of long-form speech remains a critical yet underexplored challenge. This gap is primarily attributed to the scarcity of long-speech training datasets and the high computational costs associated with long sequences. To address these limitations, we introduce FastLongSpeech, a novel framework designed to extend LSLM capabilities for efficient long-speech processing without necessitating dedicated long-speech training data. FastLongSpeech incorporates an iterative fusion strategy that can compress excessively long speech sequences into manageable lengths. To adapt LSLMs to long-speech inputs, it introduces a dynamic compression training approach, which exposes the model to short-speech sequences at varying compression ratios, thereby transferring the capabilities of LSLMs to long-speech tasks. To assess the long-speech capabilities of LSLMs, we develop a long-speech understanding benchmark called LongSpeech-Eval. Experiments show that our method exhibits strong performance in both long-speech and short-speech tasks, while greatly improving inference efficiency.
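To make the compression idea concrete, below is a minimal Python sketch of one plausible reading of similarity-based iterative fusion, assuming frame-level speech features as a NumPy array. The function name, the greedy merging rule (averaging the most similar adjacent frame pair), and the example dimensions are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def iterative_fusion(frames: np.ndarray, target_len: int) -> np.ndarray:
    """Greedily merge the most similar adjacent frame pair until the
    sequence fits within target_len. frames: (T, D) speech features."""
    frames = frames.copy()
    while len(frames) > target_len:
        # Cosine similarity between each frame and its successor.
        a, b = frames[:-1], frames[1:]
        sims = (a * b).sum(axis=1) / (
            np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-8
        )
        i = int(np.argmax(sims))  # the most redundant adjacent pair
        merged = (frames[i] + frames[i + 1]) / 2.0  # fuse by averaging
        frames = np.concatenate([frames[:i], merged[None], frames[i + 2:]])
    return frames

# Hypothetical usage: compress 512 frames of 256-dim features to 128 (4x ratio).
compressed = iterative_fusion(np.random.randn(512, 256), 128)

Under this reading, the dynamic compression training the abstract describes would apply such fusion to short clips at randomly sampled target ratios, so the model learns to interpret compressed sequences without any dedicated long-speech data.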

Shoutao Guo, Shaolei Zhang, Qingkai Fang, Zhengrui Ma, Min Zhang, Yang Feng

Computing Technology, Computer Technology

Shoutao Guo, Shaolei Zhang, Qingkai Fang, Zhengrui Ma, Min Zhang, Yang Feng. FastLongSpeech: Enhancing Large Speech-Language Models for Efficient Long-Speech Processing [EB/OL]. (2025-07-20) [2025-08-16]. https://arxiv.org/abs/2507.14815.