National Preprint Platform

ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling


Source: arXiv

Abstract

The temporal complexity of electronic health record (EHR) data presents significant challenges for predicting clinical outcomes with machine learning. This paper proposes ChronoFormer, a transformer-based architecture designed to encode and leverage temporal dependencies in longitudinal patient data. ChronoFormer integrates temporal embeddings, hierarchical attention mechanisms, and domain-specific masking techniques. Extensive experiments on three benchmark tasks (mortality prediction, readmission prediction, and long-term comorbidity onset) demonstrate substantial improvements over current state-of-the-art methods. Furthermore, detailed analyses of attention patterns underscore ChronoFormer's capability to capture clinically meaningful long-range temporal relationships.
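The abstract names temporal embeddings as one ingredient but does not specify their form. As a rough illustration only, a common way a time-aware transformer encodes irregular gaps between clinical events is a sinusoidal embedding of inter-event time deltas; the sketch below assumes that scheme and is not taken from the paper:

```python
import numpy as np

def time_gap_embedding(deltas, dim=8):
    """Sinusoidal embedding of inter-event time gaps (e.g. in days).

    Hypothetical sketch: ChronoFormer's actual temporal-embedding
    form is not given in this listing; this follows the standard
    sinusoidal positional-encoding recipe applied to time deltas.
    """
    deltas = np.asarray(deltas, dtype=float)[:, None]         # (n, 1)
    # Geometric frequency ladder, one frequency per sin/cos pair.
    freqs = 1.0 / (10000.0 ** (np.arange(0, dim, 2) / dim))   # (dim/2,)
    angles = deltas * freqs                                   # (n, dim/2)
    emb = np.empty((deltas.shape[0], dim))
    emb[:, 0::2] = np.sin(angles)   # even columns: sine components
    emb[:, 1::2] = np.cos(angles)   # odd columns: cosine components
    return emb

# Encode gaps of 0, 1, and 30 days between consecutive events;
# the result could be added to the event-code embeddings.
e = time_gap_embedding([0, 1, 30])
```

Such an embedding lets attention distinguish events an hour apart from events a year apart, which plain positional indices cannot do for irregularly sampled EHR sequences.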

Yuanyun Zhang, Shi Li

Subjects: Medical Research Methods; Clinical Medicine

Yuanyun Zhang, Shi Li. ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling [EB/OL]. (2025-04-09) [2025-05-02]. https://arxiv.org/abs/2504.07373.
