WIND: Accelerated RNN-T Decoding with Windowed Inference for Non-blank Detection
We propose Windowed Inference for Non-blank Detection (WIND), a novel strategy that significantly accelerates RNN-T inference without compromising model accuracy. Instead of processing frames sequentially during inference, WIND processes multiple frames within a window in parallel, allowing the model to quickly locate non-blank predictions during decoding and yielding significant speed-ups. We implement WIND for greedy decoding and for batched greedy decoding with label-looping techniques, and we also propose a novel beam-search decoding method. Experiments on multiple datasets under different conditions show that in greedy modes our method achieves speed-ups of up to 2.4X over the baseline sequential approach while maintaining identical Word Error Rate (WER). Our beam-search algorithm achieves slightly better accuracy than alternative methods, with significantly improved speed. We will open-source our WIND implementation.
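The core idea described in the abstract, scoring a window of frames in one batched call and jumping to the first non-blank prediction rather than stepping frame by frame, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes precomputed per-frame logits for a fixed decoder state and a one-label-per-frame simplification, whereas a real RNN-T decoder would re-run the joint network after each emitted label updates the prediction-network state.

```python
import numpy as np

BLANK = 0  # assumption: blank token has id 0, as in many RNN-T setups


def wind_greedy_decode(logits, window=8):
    """Toy sketch of Windowed Inference for Non-blank Detection (WIND).

    `logits` is a (T, V) array of joint-network outputs, simplified to be
    independent of the decoder state. WIND scores `window` frames at once
    and skips directly to the first non-blank frame, instead of checking
    one frame per step.
    """
    hyp = []
    t = 0
    T = logits.shape[0]
    while t < T:
        # One batched call over the whole window (here just a vectorized
        # argmax) replaces `window` sequential joint evaluations.
        win = logits[t:t + window]
        preds = win.argmax(axis=-1)
        nonblank = np.nonzero(preds != BLANK)[0]
        if nonblank.size == 0:
            # All frames in the window are blank: skip them in one step.
            t += win.shape[0]
        else:
            # Emit the first non-blank label found inside the window.
            k = int(nonblank[0])
            hyp.append(int(preds[k]))
            # Simplification: advance past the emitting frame. A full
            # decoder would update the decoder state and re-score frame
            # t + k, since RNN-T can emit several labels per frame.
            t += k + 1
    return hyp
```

With blank-dominated frames, the loop body executes roughly T / window times rather than T times, which is the source of the speed-up the paper reports for greedy decoding.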
Hainan Xu, Vladimir Bataev, Lilit Grigoryan, Boris Ginsburg
Subjects: Computing Technology, Computer Technology
Hainan Xu, Vladimir Bataev, Lilit Grigoryan, Boris Ginsburg. WIND: Accelerated RNN-T Decoding with Windowed Inference for Non-blank Detection [EB/OL]. (2025-05-19) [2025-06-02]. https://arxiv.org/abs/2505.13765.