Adaptive Multi-prompt Contrastive Network for Few-shot Out-of-distribution Detection
Out-of-distribution (OOD) detection aims to identify outlier samples so that models trained on an in-distribution (ID) dataset do not produce unreliable outputs. Most OOD detection methods require many ID samples for training, which seriously limits their real-world applicability. To this end, we target a challenging setting: few-shot OOD detection, where only a few labeled ID samples are available, making it much more challenging than the traditional OOD detection setting. Previous few-shot OOD detection works ignore the distinct diversity between different classes. In this paper, we propose a novel network, the Adaptive Multi-prompt Contrastive Network (AMCN), which adapts the ID-OOD separation boundary by learning inter- and intra-class distributions. To compensate for the absence of OOD image samples and the scarcity of ID image samples, we leverage CLIP, which connects text with images, to engineer learnable ID and OOD textual prompts. Specifically, we first generate adaptive prompts (learnable ID prompts, label-fixed OOD prompts, and label-adaptive OOD prompts). Then, we generate an adaptive class boundary for each class by introducing a class-wise threshold. Finally, we propose a prompt-guided ID-OOD separation module to control the margin between ID and OOD prompts. Experimental results show that AMCN outperforms state-of-the-art methods.
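Below is a minimal sketch of how the components named in the abstract might be wired together, assuming PyTorch; it is not the authors' implementation. The class and function names (TextEncoder, AdaptivePrompts, prompt_margin_loss) are hypothetical, and TextEncoder is only a stand-in for a frozen CLIP text encoder.

# Minimal sketch (not the authors' code): learnable ID prompts, label-fixed and
# label-adaptive OOD prompts, a class-wise threshold, and a margin loss that
# separates ID prompt embeddings from OOD prompt embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextEncoder(nn.Module):
    """Placeholder for a frozen CLIP text encoder: maps prompt context vectors to embeddings."""
    def __init__(self, ctx_dim=512, embed_dim=512):
        super().__init__()
        self.proj = nn.Linear(ctx_dim, embed_dim)
        for p in self.parameters():
            p.requires_grad_(False)  # CLIP weights stay frozen; only prompts are learned

    def forward(self, prompt_ctx):  # prompt_ctx: (num_prompts, ctx_len, ctx_dim)
        return F.normalize(self.proj(prompt_ctx.mean(dim=1)), dim=-1)

class AdaptivePrompts(nn.Module):
    """Learnable ID prompt contexts plus fixed and adaptive OOD prompt contexts."""
    def __init__(self, num_classes, ctx_len=4, ctx_dim=512):
        super().__init__()
        # Learnable ID prompts, one context per ID class.
        self.id_ctx = nn.Parameter(torch.randn(num_classes, ctx_len, ctx_dim) * 0.02)
        # Label-fixed OOD prompt: kept frozen as a buffer.
        self.register_buffer("ood_fixed_ctx", torch.randn(1, ctx_len, ctx_dim) * 0.02)
        # Label-adaptive OOD prompts: learnable, one per ID class.
        self.ood_adaptive_ctx = nn.Parameter(torch.randn(num_classes, ctx_len, ctx_dim) * 0.02)
        # Class-wise threshold giving each ID class its own adaptive decision boundary.
        self.class_threshold = nn.Parameter(torch.zeros(num_classes))

def prompt_margin_loss(id_emb, ood_emb, margin=0.2):
    """Penalize OOD prompt embeddings whose cosine similarity to any ID prompt exceeds 1 - margin."""
    sim = id_emb @ ood_emb.t()  # cosine similarities (embeddings are L2-normalized)
    return F.relu(sim - (1.0 - margin)).mean()

if __name__ == "__main__":
    num_classes = 5
    enc, prompts = TextEncoder(), AdaptivePrompts(num_classes)
    id_emb = enc(prompts.id_ctx)                                  # (C, D) ID prompt embeddings
    ood_emb = enc(torch.cat([prompts.ood_fixed_ctx,
                             prompts.ood_adaptive_ctx], dim=0))   # (1 + C, D) OOD prompt embeddings
    loss = prompt_margin_loss(id_emb, ood_emb)
    print("ID-OOD prompt margin loss:", loss.item())

In a full training loop, this margin term would be combined with a contrastive objective over the few labeled ID images, and the class-wise thresholds would shape per-class ID-OOD boundaries at test time.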
Xiang Fang, Arvind Easwaran, Blaise Genest
Computing Technology, Computer Technology
Xiang Fang, Arvind Easwaran, Blaise Genest. Adaptive Multi-prompt Contrastive Network for Few-shot Out-of-distribution Detection [EB/OL]. (2025-06-21) [2025-07-21]. https://arxiv.org/abs/2506.17633.