Minimalist Concept Erasure in Generative Models
Recent advances in generative models have demonstrated remarkable capabilities in producing high-quality images, but their reliance on large-scale unlabeled data has raised significant safety and copyright concerns. Efforts to address these issues by erasing unwanted concepts have shown promise. However, many existing erasure methods involve excessive modifications that compromise the overall utility of the model. In this work, we address these issues by formulating a novel minimalist concept erasure objective based \emph{only} on the distributional distance of final generation outputs. Building on our formulation, we derive a tractable loss for differentiable optimization that leverages backpropagation through all generation steps in an end-to-end manner. We also conduct extensive analysis to show theoretical connections with other models and methods. To improve the robustness of the erasure, we incorporate neuron masking as an alternative to model fine-tuning. Empirical evaluations on state-of-the-art flow-matching models demonstrate that our method robustly erases concepts without degrading overall model performance, paving the way for safer and more responsible generative models.
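The abstract mentions neuron masking as an alternative to fine-tuning for robust erasure. As a minimal illustrative sketch only (not the paper's actual method), the toy two-layer "generator" below masks the hidden neurons most activated by a concept-evoking input and measures how much the output shifts; the network weights, inputs, and top-k selection heuristic are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer "generator": x -> relu(W1 x) -> W2 h (a stand-in for a real model).
W1 = rng.normal(size=(16, 8))
W2 = rng.normal(size=(4, 16))

def generate(x, mask=None):
    h = np.maximum(W1 @ x, 0.0)
    if mask is not None:
        h = h * mask  # neuron masking: zero out the selected hidden units
    return W2 @ h

concept_x = rng.normal(size=8)  # hypothetical input evoking the unwanted concept
neutral_x = rng.normal(size=8)  # hypothetical unrelated input to preserve

# Heuristic: score neurons by activation on the concept input, mask the top-k.
h_concept = np.maximum(W1 @ concept_x, 0.0)
k = 4
mask = np.ones(16)
mask[np.argsort(h_concept)[-k:]] = 0.0

# Output drift caused by the mask, for the concept vs. a neutral input.
drift_concept = np.linalg.norm(generate(concept_x) - generate(concept_x, mask))
drift_neutral = np.linalg.norm(generate(neutral_x) - generate(neutral_x, mask))
print(f"concept drift: {drift_concept:.3f}, neutral drift: {drift_neutral:.3f}")
```

In the paper's setting the mask would instead be optimized end-to-end against a distributional distance on final generation outputs; this sketch only shows the masking mechanism itself.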
Yang Zhang, Er Jin, Yanfei Dong, Yixuan Wu, Philip Torr, Ashkan Khakzar, Johannes Stegmaier, Kenji Kawaguchi
Computing Technology, Computer Technology
Yang Zhang, Er Jin, Yanfei Dong, Yixuan Wu, Philip Torr, Ashkan Khakzar, Johannes Stegmaier, Kenji Kawaguchi. Minimalist Concept Erasure in Generative Models [EB/OL]. (2025-07-16) [2025-08-18]. https://arxiv.org/abs/2507.13386