National Preprint Platform

Let Me Grok for You: Accelerating Grokking via Embedding Transfer from a Weaker Model

Source: arXiv
English Abstract

"Grokking" is a phenomenon where a neural network first memorizes training data and generalizes poorly, but then suddenly transitions to near-perfect generalization after prolonged training. While intriguing, this delayed generalization phenomenon compromises predictability and efficiency. Ideally, models should generalize directly without delay. To this end, this paper proposes GrokTransfer, a simple and principled method for accelerating grokking in training neural networks, based on the key observation that data embedding plays a crucial role in determining whether generalization is delayed. GrokTransfer first trains a smaller, weaker model to reach a nontrivial (but far from optimal) test performance. Then, the learned input embedding from this weaker model is extracted and used to initialize the embedding in the target, stronger model. We rigorously prove that, on a synthetic XOR task where delayed generalization always occurs in normal training, GrokTransfer enables the target model to generalize directly without delay. Moreover, we demonstrate that, across empirical studies of different tasks, GrokTransfer effectively reshapes the training dynamics and eliminates delayed generalization, for both fully-connected neural networks and Transformers.
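The transfer step described in the abstract (extract the weak model's learned input embedding, then use it to initialize the stronger model's embedding) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the shapes, the random stand-in for the "trained" weak embedding, and the zero-padding scheme for mismatched widths are all assumptions made here for clarity.

```python
import random

random.seed(0)
vocab_size, weak_dim, strong_dim = 97, 8, 128

# 1. Train a small, weak model until it reaches nontrivial (but far
#    from optimal) test performance. Its learned input embedding is
#    stood in here by random values purely for illustration.
weak_embedding = [[random.gauss(0.0, 1.0) for _ in range(weak_dim)]
                  for _ in range(vocab_size)]

# 2. Initialize the target (stronger) model's embedding from the weak
#    model. When the widths differ, one simple option (an assumption,
#    not necessarily the paper's exact recipe) is to copy each learned
#    vector into the first weak_dim coordinates and zero-pad the rest.
strong_embedding = [row[:] + [0.0] * (strong_dim - weak_dim)
                    for row in weak_embedding]

# 3. The stronger model is then trained from this embedding, with its
#    remaining layers freshly initialized as usual.
```

The rest of training proceeds normally; only the embedding layer's initialization changes.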

Zhiwei Xu, Zhiyu Ni, Yixin Wang, Wei Hu

Computing Technology; Computer Technology

Zhiwei Xu, Zhiyu Ni, Yixin Wang, Wei Hu. Let Me Grok for You: Accelerating Grokking via Embedding Transfer from a Weaker Model [EB/OL]. (2025-04-17) [2025-05-18]. https://arxiv.org/abs/2504.13292.
