Adversarial Data Augmentation for Single Domain Generalization via Lyapunov Exponent-Guided Optimization

Source: arXiv
Abstract

Single Domain Generalization (SDG) aims to develop models capable of generalizing to unseen target domains using only one source domain, a task complicated by substantial domain shifts and limited data diversity. Existing SDG approaches primarily rely on data augmentation techniques, which struggle to effectively adapt training dynamics to accommodate large domain shifts. To address this, we propose LEAwareSGD, a novel Lyapunov Exponent (LE)-guided optimization approach inspired by dynamical systems theory. By leveraging LE measurements to modulate the learning rate, LEAwareSGD encourages model training near the edge of chaos, a critical state that optimally balances stability and adaptability. This dynamic adjustment allows the model to explore a wider parameter space and capture more generalizable features, ultimately enhancing the model's generalization capability. Extensive experiments on PACS, OfficeHome, and DomainNet demonstrate that LEAwareSGD yields substantial generalization gains, achieving up to 9.47% improvement on PACS in low-data regimes. These results underscore the effectiveness of training near the edge of chaos for enhancing model generalization capability in SDG tasks.
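The abstract does not spell out how the LE measurement enters the learning-rate update, so the following is only a minimal sketch of the general idea, not the authors' LEAwareSGD implementation: estimate a local Lyapunov exponent from the growth of a small parameter perturbation over one SGD step, then nudge the learning rate so the estimate stays near zero (the edge of chaos). The toy problem, the `local_le` estimator, and the multiplicative schedule with its target/gain constants are all illustrative assumptions.

```python
# Hypothetical sketch of LE-guided learning-rate modulation (not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem standing in for the single source-domain training set.
X = rng.normal(size=(256, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.1 * rng.normal(size=256)

def grad(w, idx):
    """Mini-batch gradient of the mean-squared-error loss."""
    xb, yb = X[idx], y[idx]
    return 2.0 * xb.T @ (xb @ w - yb) / len(idx)

def local_le(w, lr, idx, eps=1e-4):
    """One-step Lyapunov-exponent proxy (assumed estimator): log growth rate
    of a small parameter perturbation under a single SGD update."""
    delta = eps * rng.normal(size=w.shape)
    w_a = w - lr * grad(w, idx)
    w_b = (w + delta) - lr * grad(w + delta, idx)
    return float(np.log(np.linalg.norm(w_b - w_a) / np.linalg.norm(delta) + 1e-12))

# Illustrative schedule: shrink the step size when the local LE is positive
# (chaotic regime) and grow it when the LE falls below the target, so training
# hovers near LE ~ 0, i.e. the edge of chaos.
lr, target_le, gain = 0.05, 0.0, 0.5
w = np.zeros(8)
for step in range(200):
    idx = rng.choice(len(X), size=32, replace=False)
    le = local_le(w, lr, idx)
    lr *= float(np.exp(-gain * (le - target_le)))  # multiplicative pull toward the target LE
    lr = float(np.clip(lr, 1e-4, 0.2))             # keep the step size in a safe range
    w -= lr * grad(w, idx)

print(f"final loss: {np.mean((X @ w - y) ** 2):.4f}, final lr: {lr:.4f}")
```

In this sketch the LE estimate plays the role the abstract describes for LEAwareSGD: it acts as a feedback signal on the training dynamics, raising the learning rate when updates are overly contractive and lowering it when perturbations start to amplify.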

Zuyu Zhang, Ning Chen, Yongshan Liu, Qinghua Zhang, Xu Zhang

Subject: Computing Technology, Computer Technology

Zuyu Zhang, Ning Chen, Yongshan Liu, Qinghua Zhang, Xu Zhang. Adversarial Data Augmentation for Single Domain Generalization via Lyapunov Exponent-Guided Optimization [EB/OL]. (2025-07-06) [2025-07-16]. https://arxiv.org/abs/2507.04302.
