Ankh3: Multi-Task Pretraining with Sequence Denoising and Completion Enhances Protein Representations
Protein language models (PLMs) have emerged as powerful tools for detecting complex patterns in protein sequences. However, their ability to fully capture the information in protein sequences may be limited by a focus on a single pre-training task. Although adding data modalities or supervised objectives can improve PLM performance, pre-training often remains centered on denoising corrupted sequences. To push the boundaries of PLMs, our research investigated a multi-task pre-training strategy. We developed Ankh3, a model jointly optimized on two objectives: masked language modeling with multiple masking probabilities and protein sequence completion, relying only on protein sequences as input. This multi-task pre-training demonstrated that PLMs can learn richer and more generalizable representations from protein sequences alone. The results showed improved performance on downstream tasks such as secondary structure prediction, fluorescence prediction, GB1 fitness prediction, and contact prediction. The integration of multiple tasks gave the model a more comprehensive understanding of protein properties, leading to more robust and accurate predictions.
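To make the two pre-training objectives concrete, below is a minimal Python sketch of how training examples for each could be prepared, assuming a T5-style encoder-decoder with sentinel tokens. The masking rates, sentinel format, and prefix split point are illustrative assumptions, not the exact Ankh3 recipe.

# Minimal sketch of the two pre-training objectives described in the
# abstract, assuming a T5-style encoder-decoder PLM. Masking rates,
# sentinel handling, and the completion split are illustrative
# assumptions, not the exact Ankh3 recipe.
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
MASK_RATES = [0.15, 0.30, 0.45]  # hypothetical set of masking probabilities

def denoising_example(seq: str) -> tuple[str, str]:
    """Objective 1: masked language modeling at a sampled masking rate.

    Returns (encoder_input, decoder_target): masked residues are replaced
    with sentinel tokens <X0>, <X1>, ..., and the target lists each
    sentinel followed by the residue it hides. For simplicity each masked
    residue gets its own sentinel, rather than merging contiguous spans.
    """
    rate = random.choice(MASK_RATES)
    inp, tgt, k = [], [], 0
    for aa in seq:
        if random.random() < rate:
            inp.append(f"<X{k}>")
            tgt.append(f"<X{k}>{aa}")
            k += 1
        else:
            inp.append(aa)
    return " ".join(inp), " ".join(tgt)

def completion_example(seq: str, min_prefix: int = 5) -> tuple[str, str]:
    """Objective 2: sequence completion.

    Splits the sequence at a random point into an observed prefix
    (encoder input) and a suffix (decoder target) that the model must
    generate, conditioning only on the protein sequence itself.
    """
    cut = random.randint(min_prefix, len(seq) - 1)
    return " ".join(seq[:cut]), " ".join(seq[cut:])

if __name__ == "__main__":
    seq = "".join(random.choices(AMINO_ACIDS, k=40))
    print(denoising_example(seq))   # masked LM example
    print(completion_example(seq))  # completion example

In a joint training loop, each batch would draw examples from both generators and sum (or weight) the two cross-entropy losses, which is one common way to realize this kind of multi-task objective.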
Hazem Alsamkary, Mohamed Elshaffei, Mohamed Elkerdawy, Ahmed Elnaggar
Biological science research methods, biological science research techniques; biological science theory, biological science methods
Hazem Alsamkary, Mohamed Elshaffei, Mohamed Elkerdawy, Ahmed Elnaggar. Ankh3: Multi-Task Pretraining with Sequence Denoising and Completion Enhances Protein Representations [EB/OL]. (2025-05-26) [2025-06-17]. https://arxiv.org/abs/2505.20052.