
Exploring Criteria of Loss Reweighting to Enhance LLM Unlearning


Source: arXiv
English Abstract

Loss reweighting has shown significant benefits for machine unlearning with large language models (LLMs). However, its exact functionality remains unclear and the optimal strategy is still an open question, impeding both the understanding and the improvement of existing methodologies. In this paper, we identify two distinct goals of loss reweighting, namely Saturation and Importance: the former indicates that insufficiently optimized data should be emphasized, while the latter stresses critical data that are most influential for loss minimization. To study their usefulness, we design specific reweighting strategies for each goal and evaluate their respective effects on unlearning. We conduct extensive empirical analyses on well-established benchmarks and summarize the key observations as follows: (i) Saturation-based reweighting enhances unlearning efficacy more than importance-based reweighting, and combining the two yields additional improvements. (ii) Saturation typically allocates lower weights to data with lower likelihoods, whereas importance-based reweighting does the opposite. (iii) The efficacy of unlearning is also largely influenced by the smoothness and granularity of the weight distributions. Based on these findings, we propose SatImp, a simple reweighting method that combines the advantages of both saturation and importance. Empirical results on extensive datasets validate the efficacy of our method, potentially bridging existing research gaps and indicating directions for future research. Our code is available at https://github.com/tmlr-group/SatImp.
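The abstract does not specify the concrete weighting formulas, so the sketch below is only a hypothetical illustration of how saturation-style and importance-style per-token weights might be combined in a gradient-ascent unlearning loss. The function name reweighted_unlearning_loss, the exponents alpha/beta, and the multiplicative combination of the two weights are assumptions for illustration, not the SatImp formulation from the paper; for the actual method, see the linked repository.

```python
# Minimal, hypothetical sketch of saturation- and importance-style per-token
# reweighting in an unlearning loss (not the SatImp formulation).
import torch
import torch.nn.functional as F

def reweighted_unlearning_loss(logits: torch.Tensor, labels: torch.Tensor,
                               alpha: float = 1.0, beta: float = 1.0) -> torch.Tensor:
    """Hypothetical weighted unlearning loss on forget-set tokens.

    logits: (batch, seq_len, vocab_size) model outputs on forget data.
    labels: (batch, seq_len) ground-truth token ids to be forgotten.
    """
    log_probs = F.log_softmax(logits, dim=-1)
    token_logp = log_probs.gather(-1, labels.unsqueeze(-1)).squeeze(-1)  # (batch, seq_len)
    token_p = token_logp.exp().detach()  # stop-grad: weights are treated as constants

    # Saturation-style weight (assumption): emphasize tokens the model still
    # assigns high likelihood, i.e. data that are insufficiently unlearned.
    w_sat = token_p ** alpha
    # Importance-style weight (assumption): emphasize low-likelihood tokens,
    # which dominate the remaining loss.
    w_imp = (1.0 - token_p) ** beta
    weights = w_sat * w_imp  # hypothetical multiplicative combination

    # Minimizing the weighted mean log-likelihood drives the probabilities of
    # forget tokens down (gradient ascent on the standard NLL objective).
    return (weights * token_logp).mean()


if __name__ == "__main__":
    # Usage sketch with random tensors standing in for model outputs.
    logits = torch.randn(2, 8, 100)
    labels = torch.randint(0, 100, (2, 8))
    print(reweighted_unlearning_loss(logits, labels).item())
```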

Puning Yang, Qizhou Wang, Zhuo Huang, Tongliang Liu, Chengqi Zhang, Bo Han

Computing Technology; Computer Technology

Puning Yang, Qizhou Wang, Zhuo Huang, Tongliang Liu, Chengqi Zhang, Bo Han. Exploring Criteria of Loss Reweighting to Enhance LLM Unlearning [EB/OL]. (2025-05-17) [2025-07-20]. https://arxiv.org/abs/2505.11953.
