A Mixture of Experts (MoE) model to improve AI-based computational pathology prediction performance under variable levels of histopathology image blur
AI-based models for histopathology whole slide image (WSI) analysis are increasingly common, but unsharp or blurred areas within WSIs can significantly reduce prediction performance. In this study, we investigated the effect of image blur on deep learning models and introduced a mixture of experts (MoE) strategy that combines predictions from multiple expert models trained on data with varying blur levels. Using H&E-stained WSIs from 2,093 breast cancer patients, we benchmarked performance on grade classification and IHC biomarker prediction with both CNN-based (CNN_CLAM and MoE-CNN_CLAM) and Vision Transformer-based (UNI_CLAM and MoE-UNI_CLAM) models. Our results show that baseline model performance consistently decreased with increasing blur, whereas expert models trained on blurred tiles, and especially the proposed MoE approach, substantially improved performance and outperformed baseline models across a range of simulated scenarios. MoE-CNN_CLAM outperformed the baseline CNN_CLAM under moderate blur (AUC: 0.868 vs. 0.702) and mixed blur conditions (AUC: 0.890 vs. 0.875). MoE-UNI_CLAM outperformed the baseline UNI_CLAM model under both moderate blur (AUC: 0.950 vs. 0.928) and mixed blur conditions (AUC: 0.944 vs. 0.931). This MoE method has the potential to enhance the reliability of AI-based pathology models under variable image quality, supporting broader application in both research and clinical settings.
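The abstract describes combining predictions from expert models specialized for different blur levels. The sketch below illustrates one plausible way such a combination could work; it is not the authors' implementation, and all names (estimate_blur_score, gating_weights, the experts' predict_proba interface, and the blur-distance gating rule) are hypothetical assumptions for illustration only.

```python
# Illustrative sketch of a mixture-of-experts combination over
# blur-specialized slide-level models (hypothetical, not the paper's code).
import numpy as np
import cv2


def estimate_blur_score(tile: np.ndarray) -> float:
    """Proxy for tile sharpness: variance of the Laplacian (lower = blurrier)."""
    gray = cv2.cvtColor(tile, cv2.COLOR_RGB2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def gating_weights(blur_scores: list[float], expert_centers: list[float]) -> np.ndarray:
    """Soft-assign a slide to experts by how close its median tile sharpness
    is to each expert's training blur level (softmax over negative distance)."""
    slide_sharpness = float(np.median(blur_scores))
    dists = np.array([abs(slide_sharpness - c) for c in expert_centers])
    logits = -dists / (dists.std() + 1e-8)
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()


def moe_predict(tiles: list[np.ndarray], experts: list, expert_centers: list[float]) -> float:
    """Weighted average of slide-level probabilities from blur-specialized experts.
    Each expert is assumed to expose predict_proba(tiles) -> slide-level probability."""
    blur_scores = [estimate_blur_score(t) for t in tiles]
    weights = gating_weights(blur_scores, expert_centers)
    expert_probs = np.array([e.predict_proba(tiles) for e in experts])
    return float(np.dot(weights, expert_probs))
```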
Yujie Xiang, Bojing Liu, Mattias Rantalainen
Subjects: Medical research methods; computing technology, computer technology
Yujie Xiang, Bojing Liu, Mattias Rantalainen. A Mixture of Experts (MoE) model to improve AI-based computational pathology prediction performance under variable levels of histopathology image blur [EB/OL]. (2025-07-18) [2025-08-06]. https://arxiv.org/abs/2405.09298.