National Preprint Platform

Let's Put Ourselves in Sally's Shoes: Shoes-of-Others Prefixing Improves Theory of Mind in Large Language Models


Source: arXiv
Abstract (English)

Recent studies have shown that Theory of Mind (ToM) in large language models (LLMs) has not reached human-level performance yet. Since fine-tuning LLMs on ToM datasets often degrades their generalization, several inference-time methods have been proposed to enhance ToM in LLMs. However, existing inference-time methods for ToM are specialized for inferring beliefs from contexts involving changes in the world state. In this study, we present a new inference-time method for ToM, Shoes-of-Others (SoO) prefixing, which makes fewer assumptions about contexts and is applicable to broader scenarios. SoO prefixing simply specifies the beginning of LLM outputs with "Let's put ourselves in A's shoes.", where A denotes the target character's name. We evaluate SoO prefixing on two benchmarks that assess ToM in conversational and narrative contexts without changes in the world state and find that it consistently improves ToM across five categories of mental states. Our analysis suggests that SoO prefixing elicits faithful thoughts, thereby improving the ToM performance.
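The abstract describes the method as pre-seeding the model's output with a fixed sentence naming the target character. A minimal sketch of how such output prefixing might be wired is below, assuming a chat-style API that accepts a partially filled assistant turn for the model to continue; the function names and message format here are illustrative assumptions, not the authors' released code.

```python
def soo_prefix(character: str) -> str:
    """Build the Shoes-of-Others (SoO) prefix for a target character."""
    return f"Let's put ourselves in {character}'s shoes."


def apply_soo_prefixing(question: str, character: str) -> list:
    """Construct chat messages where the assistant turn is pre-seeded
    with the SoO prefix, so the LLM generates its answer as a
    continuation of that sentence (hypothetical message schema)."""
    return [
        {"role": "user", "content": question},
        {"role": "assistant", "content": soo_prefix(character)},
    ]


messages = apply_soo_prefixing(
    "Where will Sally look for her marble?", "Sally"
)
print(messages[-1]["content"])  # Let's put ourselves in Sally's shoes.
```

Whether the prefix is passed as a partial assistant turn or appended to the prompt depends on the serving API; the key point from the paper is only that the output must begin with this sentence.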

Kazutoshi Shinoda, Nobukatsu Hojo, Kyosuke Nishida, Yoshihiro Yamazaki, Keita Suzuki, Hiroaki Sugiyama, Kuniko Saito

Subject: Computing Technology; Computer Technology

Kazutoshi Shinoda, Nobukatsu Hojo, Kyosuke Nishida, Yoshihiro Yamazaki, Keita Suzuki, Hiroaki Sugiyama, Kuniko Saito. Let's Put Ourselves in Sally's Shoes: Shoes-of-Others Prefixing Improves Theory of Mind in Large Language Models [EB/OL]. (2025-06-06) [2025-07-16]. https://arxiv.org/abs/2506.05970.
