Modular Delta Merging with Orthogonal Constraints: A Scalable Framework for Continual and Reversible Model Composition
In real-world machine learning deployments, models must be continually updated, composed, and, when required, selectively undone. However, existing approaches to model merging and continual learning often suffer from task interference, catastrophic forgetting, or a lack of reversibility. We propose Modular Delta Merging with Orthogonal Constraints (MDM-OC), a novel framework that enables scalable, interference-free, and reversible composition of fine-tuned models. Each task-specific model is encoded as a delta from a shared base and projected into an orthogonal subspace to eliminate conflicts. The projected deltas are then merged via gradient-based optimization to form a unified model that retains performance across tasks. Our approach supports continual integration of new models and structured unmerging for compliance with regulations such as the GDPR, and it preserves model stability via elastic weight consolidation and synthetic replay. Extensive experiments on vision and natural language processing benchmarks demonstrate that MDM-OC outperforms prior baselines in accuracy, backward transfer, and unmerge fidelity while remaining memory-efficient and computationally tractable. The framework offers a principled solution for modular and compliant AI system design.
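The abstract describes a pipeline of delta extraction, orthogonal projection, additive merging, and subtractive unmerging. The following minimal NumPy sketch illustrates one plausible reading of that pipeline; the function names, the flat-vector treatment of parameters, and the Gram-Schmidt projection are illustrative assumptions rather than the paper's implementation, and the paper's actual merge step uses gradient-based optimization rather than a plain sum.

```python
# Minimal sketch of the delta-merge / unmerge idea described in the abstract.
# Illustration under assumptions, not the authors' implementation: MDM-OC
# merges via gradient-based optimization; here we use flat NumPy vectors,
# Gram-Schmidt orthogonalization, and a simple additive merge.
import numpy as np

def extract_delta(task_params: np.ndarray, base_params: np.ndarray) -> np.ndarray:
    """Encode a fine-tuned model as a delta from the shared base."""
    return task_params - base_params

def project_orthogonal(delta: np.ndarray, prior_deltas: list) -> np.ndarray:
    """Gram-Schmidt: remove components of `delta` that overlap with
    previously merged deltas, so tasks occupy orthogonal subspaces."""
    for d in prior_deltas:
        norm_sq = d @ d
        if norm_sq > 0:
            delta = delta - ((delta @ d) / norm_sq) * d
    return delta

def merge(base_params: np.ndarray, deltas: list) -> np.ndarray:
    """Compose the unified model as the base plus the sum of orthogonal deltas."""
    return base_params + np.sum(deltas, axis=0)

def unmerge(merged_params: np.ndarray, delta_to_remove: np.ndarray) -> np.ndarray:
    """Reversibility: because deltas are orthogonal, removing one task is a
    subtraction that leaves the other tasks' contributions intact."""
    return merged_params - delta_to_remove

# Usage: merge two task models, then revoke task 1 (e.g. a GDPR deletion request).
rng = np.random.default_rng(0)
base = rng.normal(size=1000)
task1 = base + 0.1 * rng.normal(size=1000)
task2 = base + 0.1 * rng.normal(size=1000)

d1 = project_orthogonal(extract_delta(task1, base), [])
d2 = project_orthogonal(extract_delta(task2, base), [d1])
merged = merge(base, [d1, d2])
restored = unmerge(merged, d1)           # remove task 1's contribution
assert np.allclose(restored, base + d2)  # task 2 is preserved exactly
```

Under these assumptions, unmerging is exact: orthogonality guarantees that subtracting one task's delta cannot disturb the components contributed by the others.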
Haris Khan, Shumaila Asif, Sadia Asif
Subjects: Computing Technology, Computer Technology
Haris Khan, Shumaila Asif, Sadia Asif. Modular Delta Merging with Orthogonal Constraints: A Scalable Framework for Continual and Reversible Model Composition [EB/OL]. (2025-07-28) [2025-08-06]. https://arxiv.org/abs/2507.20997.