
Relaxed syntax modeling in Transformers for future-proof license plate recognition

Source: arXiv
Abstract

Effective license plate recognition systems must be resilient to constant change, as new license plates enter traffic daily. While Transformer-based networks excel at recognition at first sight, we observe a significant performance drop over time, which makes them unsuitable for demanding production environments. Indeed, such systems obtain state-of-the-art results on plates whose syntax was seen during training. Yet, we show that they perform on par with random guessing on future plates, where legible characters are wrongly recognized due to a shift in their syntax. After tracing the flows of positional and contextual information in Transformer encoder-decoders, we identify several causes of their over-reliance on past syntax. We then devise architectural cut-offs and replacements, which we integrate into SaLT, an attempt at a Syntax-Less Transformer for syntax-agnostic modeling of license plate representations. Experiments on both real and synthetic datasets show that our approach reaches top accuracy on past syntax and, most importantly, nearly maintains performance on future license plates. We further demonstrate the robustness of our architectural enhancements by way of various ablations.
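The abstract does not spell out SaLT's exact architectural changes, but the underlying idea, predicting each character from visual evidence rather than from the characters around it, can be sketched. Below is a minimal illustration in PyTorch, assuming a non-autoregressive decoder driven by learned position queries; the class name, dimensions, and the choice of cut-off are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class PositionQueryDecoder(nn.Module):
    """Non-autoregressive decoder sketch: each character slot is a learned
    position query that attends to visual features instead of previously
    predicted characters, so inter-character syntax cannot be memorized."""

    def __init__(self, d_model=256, nhead=8, num_layers=3,
                 max_chars=10, vocab_size=37):
        super().__init__()
        # One learned query per character slot: positional information only,
        # no embedding of previously decoded characters.
        self.pos_queries = nn.Parameter(torch.randn(max_chars, d_model))
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, vocab_size)

    def forward(self, visual_features):
        # visual_features: (batch, seq_len, d_model) from any image encoder.
        batch = visual_features.size(0)
        queries = self.pos_queries.unsqueeze(0).expand(batch, -1, -1)
        # Cross-attention reads the image; the self-attention among queries
        # is the remaining path through which syntax could still leak, and
        # masking or removing it would be one conceivable "cut-off".
        decoded = self.decoder(queries, visual_features)
        return self.classifier(decoded)  # (batch, max_chars, vocab_size)

# Hypothetical usage: encoder features for a batch of 2 plate images,
# 37-symbol alphabet (26 letters, 10 digits, 1 padding symbol).
feats = torch.randn(2, 64, 256)
logits = PositionQueryDecoder()(feats)  # shape: (2, 10, 37)
```

Compared with a standard autoregressive decoder, which feeds previously predicted characters back in and can therefore internalize the training plates' syntax, this formulation leaves position as the only prior each character slot carries.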

Florent Meyer, Laurent Guichard, Denis Coquenet, Guillaume Gravier, Yann Soullard, Bertrand Coüasnon

Subjects: Computing technology; Computer technology

Florent Meyer, Laurent Guichard, Denis Coquenet, Guillaume Gravier, Yann Soullard, Bertrand Coüasnon. Relaxed syntax modeling in Transformers for future-proof license plate recognition [EB/OL]. (2025-06-20) [2025-06-30]. https://arxiv.org/abs/2506.17051.
