
MolX: Enhancing Large Language Models for Molecular Understanding With A Multi-Modal Extension


Source: arXiv
Abstract

Large Language Models (LLMs), with their strong task-handling capabilities, have shown remarkable advancements across a spectrum of fields, moving beyond natural language understanding. However, their proficiency within the chemistry domain remains restricted, especially in solving molecule-related tasks. This challenge is attributed to their inherent limitations in comprehending molecules using only common textual representations, i.e., SMILES strings. In this study, we seek to enhance the ability of LLMs to comprehend molecules by equipping them with a multi-modal external module, termed MolX. Instead of directly using SMILES strings to represent a molecule, we utilize specific encoders to extract fine-grained features from both the SMILES string and the 2D molecular graph representations, and feed them into an LLM. A hand-crafted molecular fingerprint is also incorporated to leverage its embedded domain knowledge. To align MolX with the LLM's textual input space, the model, with the LLM kept frozen, is pre-trained using a strategy that includes a diverse set of tasks. Experimental evaluations show that our proposed method outperforms baselines across four downstream molecule-related tasks, ranging from molecule-to-text translation to retrosynthesis, with and without fine-tuning the LLM, while introducing only a small number of trainable parameters: 0.53% and 0.82%, respectively.
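To make the pipeline described above concrete, the following is a minimal, hypothetical PyTorch sketch of how a MolX-style module might project per-modality molecular features into a frozen LLM's token-embedding space. The class name MolXSketch, all dimensions, and the fusion-by-stacking strategy are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn


class MolXSketch(nn.Module):
    """Hypothetical MolX-style multi-modal extension (not the authors' code).

    Projects features from a SMILES encoder, a 2D molecular-graph encoder,
    and a hand-crafted fingerprint into the frozen LLM's token-embedding
    space, yielding "soft tokens" that can be prepended to the textual
    prompt embeddings. All dimensions below are illustrative assumptions.
    """

    def __init__(self, smiles_dim=768, graph_dim=300, fp_dim=2048, llm_dim=4096):
        super().__init__()
        # One learned projection per modality; in this sketch these small
        # layers are the only trainable parameters, while the LLM stays frozen.
        self.smiles_proj = nn.Linear(smiles_dim, llm_dim)
        self.graph_proj = nn.Linear(graph_dim, llm_dim)
        self.fp_proj = nn.Linear(fp_dim, llm_dim)

    def forward(self, smiles_feat, graph_feat, fingerprint):
        # Stack one projected embedding per modality: (batch, 3, llm_dim).
        return torch.stack(
            [
                self.smiles_proj(smiles_feat),
                self.graph_proj(graph_feat),
                self.fp_proj(fingerprint),
            ],
            dim=1,
        )


# Toy usage with random features standing in for real encoder outputs.
molx = MolXSketch()
mol_tokens = molx(
    torch.randn(2, 768),   # e.g., pooled output of a SMILES encoder
    torch.randn(2, 300),   # e.g., pooled output of a graph neural network
    torch.randn(2, 2048),  # e.g., a fingerprint cast to a float vector
)
print(mol_tokens.shape)  # torch.Size([2, 3, 4096])
```

In a setup like this, only the projection layers would be updated during alignment pre-training, which is consistent with the small trainable-parameter fractions reported in the abstract.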

Khiem Le, Zhichun Guo, Kaiwen Dong, Xiaobao Huang, Bozhao Nan, Roshni Iyer, Xiangliang Zhang, Olaf Wiest, Wei Wang, Ting Hua, Nitesh V. Chawla

Subject: Chemistry

Khiem Le, Zhichun Guo, Kaiwen Dong, Xiaobao Huang, Bozhao Nan, Roshni Iyer, Xiangliang Zhang, Olaf Wiest, Wei Wang, Ting Hua, Nitesh V. Chawla. MolX: Enhancing Large Language Models for Molecular Understanding With A Multi-Modal Extension [EB/OL]. (2025-07-07) [2025-07-19]. https://arxiv.org/abs/2406.06777.
