FastFace: Tuning Identity Preservation in Distilled Diffusion via Guidance and Attention
In recent years, a plethora of identity-preserving adapters for personalized generation with diffusion models have been released. Their main drawback is that they are predominantly trained jointly with base diffusion models, which suffer from slow multi-step inference. This work tackles the challenge of training-free adaptation of pretrained ID-adapters to diffusion models accelerated via distillation: through a careful redesign of classifier-free guidance for few-step stylistic generation, together with attention manipulation mechanisms in decoupled blocks that improve identity similarity and fidelity, we propose the universal FastFace framework. Additionally, we develop a disentangled public evaluation protocol for ID-preserving adapters.
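For context, classifier-free guidance combines an unconditional and a conditional noise prediction at each denoising step. Below is a minimal NumPy sketch of the standard formulation, not the paper's redesigned few-step variant; `cfg_denoise`, the guidance scale `w`, and the toy arrays are illustrative stand-ins.

```python
import numpy as np

def cfg_denoise(eps_uncond: np.ndarray, eps_cond: np.ndarray, w: float) -> np.ndarray:
    """Standard classifier-free guidance: extrapolate from the unconditional
    noise prediction toward the (e.g. identity-)conditioned one by scale w."""
    return eps_uncond + w * (eps_cond - eps_uncond)

# Toy usage with random stand-ins for the model's outputs at one sampling step.
rng = np.random.default_rng(0)
eps_u = rng.standard_normal((4, 4))   # unconditional prediction
eps_c = rng.standard_normal((4, 4))   # ID-conditioned prediction
guided = cfg_denoise(eps_u, eps_c, w=3.0)
```

Distilled samplers take only a handful of denoising steps, so naively applying this multi-step guidance rule tends to degrade quality, which is the motivation for the redesign proposed in the paper.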
Sergey Karpukhin, Aibek Alanov, Andrey Kuznetsov, Vadim Titov
Computing Technology, Computer Technology
Sergey Karpukhin, Aibek Alanov, Andrey Kuznetsov, Vadim Titov. FastFace: Tuning Identity Preservation in Distilled Diffusion via Guidance and Attention [EB/OL]. (2025-05-27) [2025-06-13]. https://arxiv.org/abs/2505.21144