
Extending Memorization Dynamics in Pythia Models from Instance-Level Insights

Source: arXiv
Abstract

Large language models have demonstrated a remarkable ability for verbatim memorization. While numerous works have explored factors influencing model memorization, the dynamic evolution of memorization patterns remains underexplored. This paper presents a detailed analysis of memorization in the Pythia model family across varying scales and training steps under prefix perturbations. Using granular metrics, we examine how model architecture, data characteristics, and perturbations influence these patterns. Our findings reveal that: (1) as model scale increases, memorization expands incrementally while efficiency decreases rapidly; (2) as model scale increases, the rate of new memorization acquisition decreases while forgetting of old memorization increases; (3) data characteristics (token frequency, repetition count, and uncertainty) differentially affect memorized versus non-memorized samples; and (4) prefix perturbations reduce memorization and increase generation uncertainty proportionally to perturbation strength, with low-redundancy samples showing higher vulnerability and larger models offering no additional robustness. These findings advance our understanding of memorization mechanisms, with direct implications for training optimization, privacy safeguards, and architectural improvements.
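
The abstract itself contains no code; as a rough illustration of the setup it describes, below is a minimal sketch of the verbatim-memorization test commonly used in Pythia memorization studies (a sample counts as memorized if a k-token prefix greedily elicits the exact next k tokens), together with a simple random-token prefix perturbation. The checkpoint name, k = 32, the 10% perturbation strength, and the perturbation scheme are illustrative assumptions, not the paper's exact protocol.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "EleutherAI/pythia-160m"  # assumption: any Pythia size/revision works here
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL)
model.eval()

def is_memorized(sample_ids: torch.Tensor, k: int = 32) -> bool:
    """True if greedy decoding from the first k tokens reproduces tokens k..2k."""
    prefix = sample_ids[:k].unsqueeze(0)
    target = sample_ids[k:2 * k]
    with torch.no_grad():
        out = model.generate(
            prefix,
            max_new_tokens=k,
            do_sample=False,  # greedy decoding: the standard verbatim test
            pad_token_id=tokenizer.eos_token_id,
        )
    # torch.equal is False on a shape mismatch (e.g. early EOS), which is correct here
    return torch.equal(out[0, k:2 * k], target)

def perturb_prefix(sample_ids: torch.Tensor, k: int = 32,
                   strength: float = 0.1) -> torch.Tensor:
    """Replace a `strength` fraction of the first k tokens with random
    vocabulary tokens -- one possible prefix perturbation, assumed for illustration."""
    ids = sample_ids.clone()
    n_flip = max(1, int(strength * k))
    positions = torch.randperm(k)[:n_flip]
    ids[positions] = torch.randint(len(tokenizer), (n_flip,))
    return ids

# Usage: compare memorization before and after perturbing the prefix.
text = "... a sequence drawn from the training corpus ..."  # hypothetical placeholder
ids = tokenizer(text, return_tensors="pt").input_ids[0]
if ids.numel() >= 64:
    print(is_memorized(ids), is_memorized(perturb_prefix(ids)))

Measuring how often the second call flips from True to False as strength grows is one way to quantify the perturbation sensitivity the abstract reports.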

Jie Zhang, Qinghua Zhao, Lei Li, Chi-ho Lin

Computing Technology, Computer Technology

Jie Zhang, Qinghua Zhao, Lei Li, Chi-ho Lin. Extending Memorization Dynamics in Pythia Models from Instance-Level Insights [EB/OL]. (2025-06-13) [2025-06-22]. https://arxiv.org/abs/2506.12321
