
FairLoop: Software Support for Human-Centric Fairness in Predictive Business Process Monitoring


Source: arXiv
Abstract

Sensitive attributes like gender or age can lead to unfair predictions in machine learning tasks such as predictive business process monitoring, particularly when used without considering context. We present FairLoop, a tool for human-guided bias mitigation in neural network-based prediction models. FairLoop distills decision trees from neural networks, allowing users to inspect and modify unfair decision logic, which is then used to fine-tune the original model towards fairer predictions. Compared to other approaches to fairness, FairLoop enables context-aware bias removal through human involvement, addressing the influence of sensitive attributes selectively rather than excluding them uniformly.
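
The abstract outlines a distill, edit, and fine-tune loop. The following Python sketch illustrates that workflow under simplifying assumptions, using scikit-learn stand-ins rather than FairLoop's actual implementation: an MLPClassifier plays the outcome-prediction network, a DecisionTreeClassifier plays the distilled surrogate, a simulated "human edit" neutralises splits on a sensitive feature, and a warm-started refit stands in for the fine-tuning step. The data, feature names, and editing heuristic are all assumptions made for this example.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy event-log features [work_item_count, case_duration, gender] with a label
# that leaks the sensitive attribute (column 2).
X = rng.random((500, 3))
y = ((X[:, 0] + 0.5 * X[:, 2]) > 0.75).astype(int)

# 1. Train the original outcome-prediction network.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                    warm_start=True, random_state=0)
net.fit(X, y)

# 2. Distill a surrogate decision tree from the network's predictions.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, net.predict(X))

# 3. Simulated human edit: neutralise every split on the sensitive attribute
#    (column 2) so that all cases take the same branch at those nodes.
SENSITIVE = 2
for node, feature in enumerate(tree.tree_.feature):
    if feature == SENSITIVE:
        # Features lie in [0, 1], so a threshold of -2.0 sends every case
        # down the same branch, removing the attribute's influence here.
        tree.tree_.threshold[node] = -2.0

# 4. Fine-tune: continue training the network on the edited tree's labels
#    (warm_start=True resumes from the weights learned in step 1).
net.fit(X, tree.predict(X))
```

In this sketch the "human" edit is a blunt heuristic; in a human-in-the-loop tool an analyst would instead inspect the distilled tree and decide node by node whether a split on a sensitive attribute is justified in context.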

Felix Möhrlein, Martin Käppel, Julian Neuberger, Sven Weinzierl, Lars Ackermann, Martin Matzner, Stefan Jablonski

Computing Technology, Computer Technology

Felix Möhrlein, Martin Käppel, Julian Neuberger, Sven Weinzierl, Lars Ackermann, Martin Matzner, Stefan Jablonski. FairLoop: Software Support for Human-Centric Fairness in Predictive Business Process Monitoring [EB/OL]. (2025-08-27) [2025-09-06]. https://arxiv.org/abs/2508.20021.
