
Protap: A Benchmark for Protein Modeling on Realistic Downstream Applications

Source: arXiv
Abstract

In recent years, a wide range of deep learning architectures and pretraining strategies have been explored to support downstream protein applications. In addition, domain-specific models incorporating biological knowledge have been developed to enhance performance on specialized tasks. In this work, we introduce $\textbf{Protap}$, a comprehensive benchmark that systematically compares backbone architectures, pretraining strategies, and domain-specific models across diverse and realistic downstream protein applications. Specifically, Protap covers five applications: three general tasks and two novel specialized tasks, namely enzyme-catalyzed protein cleavage site prediction and targeted protein degradation, which are industrially relevant yet missing from existing benchmarks. For each application, Protap compares various domain-specific models and general architectures under multiple pretraining settings. Our empirical studies show that: (i) although large-scale pretrained encoders achieve strong results, they often underperform supervised encoders trained on small downstream training sets; (ii) models that incorporate structural information during downstream fine-tuning can match or even outperform protein language models pretrained on large-scale sequence corpora; (iii) domain-specific biological priors can enhance performance on specialized downstream tasks. Code and datasets are publicly available at https://github.com/Trust-App-AI-Lab/protap.
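As a rough illustration of the kind of comparison behind finding (i), the sketch below fine-tunes two toy encoders on a tiny synthetic downstream set: one standing in for an encoder trained from scratch under supervision, the other for a pretrained encoder being adapted. All names here (ToyProteinEncoder, finetune, evaluate) and the synthetic data are hypothetical placeholders for illustration only; they are not Protap's actual API, models, or datasets.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins; Protap's real encoders, task heads, and data loaders
# live in https://github.com/Trust-App-AI-Lab/protap and are not reproduced here.

class ToyProteinEncoder(nn.Module):
    """Tiny sequence encoder standing in for either a pretrained protein
    language model or a supervised encoder trained from scratch."""
    def __init__(self, vocab_size=25, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))   # (batch, length, dim)
        return h.mean(dim=1)                   # mean-pool to a per-protein embedding


def finetune(encoder, tokens, labels, num_classes=2, epochs=5, freeze_encoder=False):
    """Attach a linear task head and (optionally) fine-tune the encoder on a
    small downstream training set, mirroring the pretrained-vs-from-scratch setup."""
    head = nn.Linear(64, num_classes)
    params = list(head.parameters())
    if not freeze_encoder:
        params += list(encoder.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        encoder.train(); head.train()
        opt.zero_grad()
        loss = loss_fn(head(encoder(tokens)), labels)
        loss.backward()
        opt.step()
    return head


def evaluate(encoder, head, tokens, labels):
    """Accuracy of an encoder plus task head on one split."""
    encoder.eval(); head.eval()
    with torch.no_grad():
        preds = head(encoder(tokens)).argmax(dim=-1)
    return (preds == labels).float().mean().item()


if __name__ == "__main__":
    # Synthetic "small downstream training set": 128 proteins, binary labels.
    train_tokens = torch.randint(0, 25, (128, 50))
    train_labels = torch.randint(0, 2, (128,))

    # (a) Encoder trained from scratch on the downstream task.
    scratch = ToyProteinEncoder()
    head_a = finetune(scratch, train_tokens, train_labels)

    # (b) "Pretrained" encoder: here a second random init acts as a placeholder
    #     for weights loaded from large-scale pretraining, fine-tuned the same way.
    pretrained = ToyProteinEncoder()
    head_b = finetune(pretrained, train_tokens, train_labels)

    print("from-scratch acc:", evaluate(scratch, head_a, train_tokens, train_labels))
    print("pretrained acc:  ", evaluate(pretrained, head_b, train_tokens, train_labels))
```

In a real benchmark run, the pretrained encoder would load weights from large-scale pretraining rather than a random initialization, and metrics would be reported on held-out test splits for each of the five applications.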

Shuo Yan, Yuliang Yan, Bin Ma, Chenao Li, Haochun Tang, Jiahua Lu, Minhua Lin, Yuyuan Feng, Hui Xiong, Enyan Dai

Biological science research methods, biological science research techniques

Shuo Yan, Yuliang Yan, Bin Ma, Chenao Li, Haochun Tang, Jiahua Lu, Minhua Lin, Yuyuan Feng, Hui Xiong, Enyan Dai. Protap: A Benchmark for Protein Modeling on Realistic Downstream Applications [EB/OL]. (2025-06-01) [2025-06-30]. https://arxiv.org/abs/2506.02052.