From Tokens to Lattices: Emergent Lattice Structures in Language Models

Source: arXiv
Abstract

Pretrained masked language models (MLMs) have demonstrated an impressive capability to comprehend and encode conceptual knowledge, revealing a lattice structure among concepts. This raises a critical question: how does this conceptualization emerge from MLM pretraining? In this paper, we explore this problem from the perspective of Formal Concept Analysis (FCA), a mathematical framework that derives concept lattices from the observations of object-attribute relationships. We show that the MLM's objective implicitly learns a "formal context" that describes objects, attributes, and their dependencies, which enables the reconstruction of a concept lattice through FCA. We propose a novel framework for concept lattice construction from pretrained MLMs and investigate the origin of the inductive biases of MLMs in lattice structure learning. Our framework differs from previous work because it does not rely on human-defined concepts and allows for discovering "latent" concepts that extend beyond human definitions. We create three datasets for evaluation, and the empirical results verify our hypothesis.
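As background on the FCA machinery the abstract refers to: a formal context records which objects have which attributes, and a formal concept is a pair (extent, intent) that is closed under the two derivation operators. The sketch below is purely illustrative and is not code or data from the paper; the toy context and function names are assumptions chosen for clarity.

```python
from itertools import combinations

# Toy formal context (hypothetical, not from the paper): object -> set of attributes.
context = {
    "sparrow": {"bird", "can_fly"},
    "penguin": {"bird"},
    "bat":     {"mammal", "can_fly"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attributes(objs):
    """Derivation A -> A': attributes shared by every object in A."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

def common_objects(attrs):
    """Derivation B -> B': objects that possess every attribute in B."""
    return {g for g in objects if attrs <= context[g]}

# A formal concept is a pair (extent, intent) with extent' = intent and intent' = extent.
# Closing every subset of objects enumerates all concepts (feasible only for tiny contexts).
concepts = set()
for r in range(len(objects) + 1):
    for subset in combinations(sorted(objects), r):
        intent = common_attributes(set(subset))
        extent = common_objects(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```

Ordering the resulting concepts by inclusion of their extents yields the concept lattice; the paper's framework derives the underlying object-attribute dependencies from a pretrained MLM rather than from a hand-built table like this one.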

Bo Xiong, Steffen Staab

Computing technology; computer technology

Bo Xiong, Steffen Staab. From Tokens to Lattices: Emergent Lattice Structures in Language Models [EB/OL]. (2025-04-04) [2025-05-23]. https://arxiv.org/abs/2504.08778
