
EOOD: Entropy-based Out-of-distribution Detection

Source: arXiv
Abstract

Deep neural networks (DNNs) often exhibit overconfidence when encountering out-of-distribution (OOD) samples, posing significant challenges for deployment. Since DNNs are trained on in-distribution (ID) datasets, the information flow of ID samples through DNNs inevitably differs from that of OOD samples. In this paper, we propose an Entropy-based Out-Of-distribution Detection (EOOD) framework. EOOD first identifies a specific block where the information flow differences between ID and OOD samples are more pronounced, using both ID and pseudo-OOD samples. It then calculates the conditional entropy on the selected block as the OOD confidence score. Comprehensive experiments conducted across various ID and OOD settings demonstrate the effectiveness of EOOD in OOD detection and its superiority over state-of-the-art methods.
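The abstract only outlines the scoring step, so the snippet below is a minimal illustrative sketch rather than the paper's exact formulation: it treats the spatially pooled channel activations of an assumed, already-selected block as a probability distribution and uses negative Shannon entropy as the OOD confidence score. The pooling, softmax normalization, and sign convention are assumptions made for illustration.

import torch
import torch.nn.functional as F

def entropy_ood_score(block_features: torch.Tensor) -> torch.Tensor:
    # block_features: activations of the selected block, shape [batch, channels, H, W].
    # Pool the spatial dimensions so each sample has one value per channel.
    pooled = block_features.mean(dim=(2, 3))                  # [batch, channels]
    # Normalize the channel activations into a probability distribution.
    probs = F.softmax(pooled, dim=1)                          # [batch, channels]
    # Shannon entropy of that distribution, one value per sample.
    entropy = -(probs * torch.log(probs + 1e-12)).sum(dim=1)  # [batch]
    # Negative entropy as the confidence score: ID samples are expected to
    # concentrate information flow and therefore score higher than OOD samples.
    return -entropy

At test time, samples whose score falls below a threshold calibrated on ID data would be flagged as OOD.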

Guide Yang, Chao Hou, Weilong Peng, Xiang Fang, Yongwei Nie, Peican Zhu, Keke Tang

Computing Technology; Computer Technology

Guide Yang, Chao Hou, Weilong Peng, Xiang Fang, Yongwei Nie, Peican Zhu, Keke Tang. EOOD: Entropy-based Out-of-distribution Detection [EB/OL]. (2025-04-04) [2025-04-24]. https://arxiv.org/abs/2504.03342.
