Human-Centered Explainability in Interactive Information Systems: A Survey
Human-centered explainability has become a critical foundation for the responsible development of interactive information systems, where users must be able to understand, interpret, and scrutinize AI-driven outputs to make informed decisions. This systematic literature survey characterizes recent progress in user studies on explainability in interactive information systems by reviewing how explainability has been conceptualized, designed, and evaluated in practice. Following PRISMA guidelines, eight academic databases were searched and 100 relevant articles were identified. A structural encoding approach was then used to extract and synthesize insights from these articles. The main contributions are 1) five dimensions that researchers have used to conceptualize explainability; 2) a classification scheme of explanation designs; and 3) a categorization of explainability measurements into six user-centered dimensions. The review concludes by reflecting on ongoing challenges and providing recommendations for future research. The findings shed light on the theoretical foundations of human-centered explainability, informing the design of interactive information systems that better align with diverse user needs and promoting the development of systems that are transparent, trustworthy, and accountable.
Yuhao Zhang, Jiaxin An, Ben Wang, Yan Zhang, Jiqun Liu
Computing Technology; Computer Technology
Yuhao Zhang, Jiaxin An, Ben Wang, Yan Zhang, Jiqun Liu. Human-Centered Explainability in Interactive Information Systems: A Survey [EB/OL]. (2025-07-03) [2025-07-16]. https://arxiv.org/abs/2507.02300.