Multimodal Cancer Survival Analysis via Hypergraph Learning with Cross-Modality Rebalance
Multimodal pathology-genomic analysis has become increasingly prominent in cancer survival prediction. However, existing studies mainly rely on multi-instance learning to aggregate patch-level features, overlooking the loss of contextual and hierarchical details within pathology images. Furthermore, the disparity in data granularity and dimensionality between pathology and genomics leads to a significant modality imbalance: the high spatial resolution inherent in pathology data gives it a dominant role in multimodal integration, overshadowing genomics. In this paper, we propose a multimodal survival prediction framework that incorporates hypergraph learning to effectively capture both contextual and hierarchical details from pathology images. It further employs a modality rebalance mechanism and an interactive alignment fusion strategy to dynamically reweight the contributions of the two modalities, thereby mitigating the pathology-genomics imbalance. Quantitative and qualitative experiments on five TCGA datasets demonstrate that our model outperforms advanced methods by over 3.4% in C-Index performance.
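To illustrate the idea of dynamically reweighting the two modalities, below is a minimal sketch of a gated rebalancing module. It is an assumption-based illustration (feature dimensions, projection layers, and the softmax gating design are hypothetical), not the authors' actual architecture, which is described in the full paper.

```python
# Minimal sketch of cross-modality rebalancing (illustrative assumptions only:
# dimensions and gating design are hypothetical, not the paper's exact method).
import torch
import torch.nn as nn

class CrossModalityRebalance(nn.Module):
    """Dynamically reweights pathology and genomics embeddings before fusion."""
    def __init__(self, path_dim=768, gene_dim=256, fused_dim=256):
        super().__init__()
        self.path_proj = nn.Linear(path_dim, fused_dim)  # project pathology features
        self.gene_proj = nn.Linear(gene_dim, fused_dim)  # project genomics features
        self.gate = nn.Linear(2 * fused_dim, 2)          # one score per modality

    def forward(self, path_feat, gene_feat):
        p = self.path_proj(path_feat)                    # (B, fused_dim)
        g = self.gene_proj(gene_feat)                    # (B, fused_dim)
        # Softmax over the two scores yields per-sample modality weights.
        w = torch.softmax(self.gate(torch.cat([p, g], dim=-1)), dim=-1)  # (B, 2)
        # Weighted sum keeps the higher-resolution modality from overshadowing the other.
        return w[:, :1] * p + w[:, 1:] * g

# Usage: fuse a batch of 4 pathology and genomics feature vectors.
model = CrossModalityRebalance()
fused = model(torch.randn(4, 768), torch.randn(4, 256))  # -> shape (4, 256)
```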
Mingcheng Qu, Guang Yang, Donglin Di, Tonghua Su, Yue Gao, Yang Song, Lei Fan
Oncology; Medical Research Methods; Biological Science Research Methods and Techniques
Mingcheng Qu, Guang Yang, Donglin Di, Tonghua Su, Yue Gao, Yang Song, Lei Fan. Multimodal Cancer Survival Analysis via Hypergraph Learning with Cross-Modality Rebalance [EB/OL]. (2025-05-17) [2025-07-16]. https://arxiv.org/abs/2505.11997