Synergy: End-to-end Concept Model
In this paper, we present Synergy, a language model that bridges different levels of abstraction in an end-to-end fashion through a learned routing mechanism. Focusing on low-level linguistic abstraction, we train the model as a byte-level language model. The model spontaneously learns to tokenize bytes, producing fewer concept tokens than Byte-level Byte Pair Encoding (BBPE) tokenizers while maintaining comparable performance. Compared with Llama3 at the same model scale and training dataset size, Synergy shows an advantage. Further studies show that the middle part of the model (the higher-abstraction part) performs better when positional encodings are removed, suggesting the emergence of position-independent concepts. These findings demonstrate the feasibility of tokenizer-free architectures, paving the way for more robust and flexible pipelines.
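To make the described architecture concrete, the sketch below illustrates one possible reading of the abstract: lower layers operate on raw bytes with positional embeddings, a learned router pools byte states into a shorter sequence of concept tokens, and the middle layers process those concepts without any positional encoding. This is a minimal, hypothetical PyTorch sketch; the module names (`ConceptRouter`, `SynergySketch`), the thresholded boundary routing, and all hyperparameters are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of the architecture described in the abstract; not the
# authors' implementation. Byte stage uses positions, concept stage does not.
import torch
import torch.nn as nn


class ConceptRouter(nn.Module):
    """Predicts per-byte boundaries and mean-pools byte states into concept tokens."""

    def __init__(self, d_model: int):
        super().__init__()
        self.boundary = nn.Linear(d_model, 1)  # per-byte boundary logit

    def forward(self, byte_states: torch.Tensor) -> torch.Tensor:
        # byte_states: (batch, n_bytes, d_model)
        probs = torch.sigmoid(self.boundary(byte_states)).squeeze(-1)   # (batch, n_bytes)
        # Hard segment ids from thresholded boundaries (illustrative only;
        # the paper's learned routing mechanism may differ).
        seg_id = (probs > 0.5).long().cumsum(dim=1)                     # (batch, n_bytes)
        n_seg = int(seg_id.max().item()) + 1
        one_hot = torch.nn.functional.one_hot(seg_id, n_seg).float()    # (batch, n_bytes, n_seg)
        counts = one_hot.sum(dim=1).clamp(min=1.0)                      # (batch, n_seg)
        # Mean-pool the byte states belonging to each segment.
        concepts = one_hot.transpose(1, 2) @ byte_states / counts.unsqueeze(-1)
        return concepts                                                  # (batch, n_concepts, d_model)


class SynergySketch(nn.Module):
    def __init__(self, d_model: int = 256, n_heads: int = 4, max_bytes: int = 2048):
        super().__init__()
        self.byte_embed = nn.Embedding(256, d_model)
        self.pos_embed = nn.Embedding(max_bytes, d_model)  # positions only in the byte stage
        make_layer = lambda: nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.byte_layers = nn.TransformerEncoder(make_layer(), num_layers=2)
        self.router = ConceptRouter(d_model)
        # Middle "concept" stage: no positional embeddings are added here.
        self.concept_layers = nn.TransformerEncoder(make_layer(), num_layers=2)

    def forward(self, byte_ids: torch.Tensor) -> torch.Tensor:
        pos = torch.arange(byte_ids.size(1), device=byte_ids.device)
        x = self.byte_embed(byte_ids) + self.pos_embed(pos)
        x = self.byte_layers(x)
        concepts = self.router(x)            # fewer tokens than input bytes
        return self.concept_layers(concepts)


# Usage: raw UTF-8 byte ids in, a shorter sequence of concept states out.
model = SynergySketch()
ids = torch.randint(0, 256, (1, 128))
print(model(ids).shape)  # (1, n_concepts, 256), with n_concepts <= 128
```

The key design point the sketch tries to capture is that only the byte-level stage receives positional information, so the pooled concept tokens in the middle stage are position-independent by construction, matching the abstract's observation that the higher-abstraction layers work better without positional encodings.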
Keli Zheng, Zerong Xie
Linguistics
Keli Zheng, Zerong Xie. Synergy: End-to-end Concept Model [EB/OL]. (2025-07-17) [2025-08-10]. https://arxiv.org/abs/2507.12769.