GPTKB v1.5: A Massive Knowledge Base for Exploring Factual LLM Knowledge
Language models are powerful tools, yet their factual knowledge remains poorly understood and inaccessible to ad-hoc browsing and scalable statistical analysis. This demonstration introduces GPTKB v1.5, a densely interlinked 100-million-triple knowledge base (KB) built for $14,000 from GPT-4.1, using the GPTKB methodology for massive-recursive LLM knowledge materialization (Hu et al., ACL 2025). The demonstration experience focuses on three use cases: (1) link-traversal-based LLM knowledge exploration, (2) SPARQL-based structured LLM knowledge querying, and (3) comparative exploration of the strengths and weaknesses of LLM knowledge. Massive-recursive LLM knowledge materialization is a groundbreaking opportunity both for the systematic analysis of LLM knowledge and for automated KB construction. The GPTKB demonstrator is accessible at https://gptkb.org.
Yujia Hu, Tuan-Phong Nguyen, Shrestha Ghosh, Moritz Müller, Simon Razniewski
Computing technology; computer technology
Yujia Hu, Tuan-Phong Nguyen, Shrestha Ghosh, Moritz Müller, Simon Razniewski. GPTKB v1.5: A Massive Knowledge Base for Exploring Factual LLM Knowledge [EB/OL]. (2025-07-08) [2025-08-02]. https://arxiv.org/abs/2507.05740.