Impact of SMILES Notational Inconsistencies on Chemical Language Model Performance
Chemical language models (CLMs), inspired by natural language processing (NLP), have recently emerged as powerful tools for various cheminformatics tasks such as molecular property prediction and molecule generation. The simplified molecular input line entry system (SMILES) is commonly employed in CLMs for representing molecular structures. However, despite attempts at standardization through Canonical SMILES, significant representational inconsistencies remain due to variations in canonicalization methods (grammatical inconsistencies) and incomplete stereochemical annotations (stereochemical inconsistencies). This study systematically investigates the prevalence and impact of these inconsistencies. Our literature review reveals that nearly half (45.4%) of the studies employing CLMs omit explicit mention of SMILES canonicalization, potentially impairing reproducibility. Through quantitative analysis of publicly available datasets, we observed substantial variability in SMILES representations, with significant gaps in stereochemical information: approximately 50% of enantiomers and 30% of geometric isomers lacked complete annotations. Empirical evaluations using an encoder-decoder CLM demonstrated that representational variations significantly affect latent molecular representations, notably reducing translation accuracy. Interestingly, these variations minimally impacted downstream property prediction tasks, likely due to robust feature selection driven by label information. Furthermore, explicit manipulation of stereochemical annotations confirmed their crucial role in accurate cyclic structure reconstruction.
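As a minimal illustrative sketch (not taken from the paper), the snippet below uses RDKit to show the two kinds of inconsistency the abstract describes: the same molecule admits multiple valid SMILES strings until a canonicalization step is applied, and stereochemical tags can be silently dropped, yielding an incompletely annotated representation.

```python
# Minimal sketch, assuming RDKit, of the SMILES variability discussed above:
# the same molecule can be written many ways, and canonicalization plus
# stereochemistry handling determine which string a model actually sees.
from rdkit import Chem

# (S)-alanine written three ways
smiles_variants = [
    "N[C@@H](C)C(=O)O",   # valid SMILES with an explicit chiral tag
    "C([C@H](N)C)(=O)O",  # different atom ordering, same molecule
    "NC(C)C(=O)O",        # stereochemical annotation missing entirely
]

for smi in smiles_variants:
    mol = Chem.MolFromSmiles(smi)
    canonical = Chem.MolToSmiles(mol)                   # canonical form, keeps stereo if present
    flat = Chem.MolToSmiles(mol, isomericSMILES=False)  # canonical form with stereo stripped
    print(f"{smi:>22} -> canonical: {canonical:>18} | without stereo: {flat}")
```

The first two variants collapse to one canonical string, while the third canonicalizes to a different, stereochemistry-free string, mirroring the grammatical and stereochemical inconsistencies quantified in the study.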
Yosuke Kikuchi, Tadahaya Mizuno, Yasuhiro Yoshikai, Shumpei Nemoto, Hiroyuki Kusuhara
Chemistry
Yosuke Kikuchi, Tadahaya Mizuno, Yasuhiro Yoshikai, Shumpei Nemoto, Hiroyuki Kusuhara. Impact of SMILES Notational Inconsistencies on Chemical Language Model Performance [EB/OL]. (2025-05-11) [2025-06-05]. https://arxiv.org/abs/2505.07139.