
Leveraging Large Language Models for Generating Research Topic Ontologies: A Multi-Disciplinary Study

Source: arXiv
Abstract

Ontologies and taxonomies of research fields are critical for managing and organising scientific knowledge, as they facilitate efficient classification, dissemination and retrieval of information. However, the creation and maintenance of such ontologies are expensive and time-consuming tasks, usually requiring the coordinated effort of multiple domain experts. Consequently, ontologies in this space often exhibit uneven coverage across different disciplines, limited inter-domain connectivity, and infrequent updating cycles. In this study, we investigate the capability of several large language models to identify semantic relationships among research topics within three academic domains: biomedicine, physics, and engineering. The models were evaluated under three distinct conditions: zero-shot prompting, chain-of-thought prompting, and fine-tuning on existing ontologies. Additionally, we assessed the cross-domain transferability of fine-tuned models by measuring their performance when trained in one domain and subsequently applied to a different one. To support this analysis, we introduce PEM-Rel-8K, a novel dataset consisting of over 8,000 relationships extracted from the most widely adopted taxonomies in the three disciplines considered in this study: MeSH, PhySH, and IEEE. Our experiments demonstrate that fine-tuning LLMs on PEM-Rel-8K yields excellent performance across all disciplines.
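As an illustration of the zero-shot prompting condition mentioned in the abstract, the sketch below shows how an LLM could be asked to classify the semantic relationship between two research topics. This is not the authors' code: the client library, model name, prompt wording, and label set are all assumptions made for illustration.

```python
# Minimal sketch of zero-shot relationship classification between research topics.
# Client library, model name, prompt, and label set are illustrative assumptions,
# not the setup used in the paper.
from openai import OpenAI

client = OpenAI()

# Assumed set of candidate relationship types.
LABELS = ["broader", "narrower", "same-as", "other"]

def classify_relation(topic_a: str, topic_b: str) -> str:
    """Ask the model which relationship holds from topic_a to topic_b."""
    prompt = (
        "Given two research topics, decide which semantic relationship holds "
        "between them.\n"
        f"Topic A: {topic_a}\n"
        f"Topic B: {topic_b}\n"
        f"Answer with exactly one of: {', '.join(LABELS)}."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; the study evaluates several LLMs
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content.strip().lower()

# Example usage:
# print(classify_relation("machine learning", "artificial intelligence"))
```

A chain-of-thought variant would extend the prompt to request intermediate reasoning before the final label, while the fine-tuning condition would instead train the model directly on labelled topic pairs such as those in PEM-Rel-8K.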

Tanay Aggarwal, Angelo Salatino, Francesco Osborne, Enrico Motta

Science, Scientific Research; Biological Science Theory, Biological Science Methods; Physics

Tanay Aggarwal, Angelo Salatino, Francesco Osborne, Enrico Motta. Leveraging Large Language Models for Generating Research Topic Ontologies: A Multi-Disciplinary Study [EB/OL]. (2025-08-28) [2025-09-06]. https://arxiv.org/abs/2508.20693.