Exploiting Mention-Entity Graph to Enhance In-context Learning for Collective Entity Linking

Authors: Xingyi Li, Boyuan Jia, Mingshuo Chen, Xiang Cheng, and Sen Su
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 979-993
Keywords: Collective entity linking, In-context learning, Mention-Entity graph, Large language models.

Abstract

Entity Linking (EL) aims at mapping mentions to their corresponding entities. It has been shown that in-context learning based approaches can achieve strong performance on this task. However, they ignore the interdependency between different EL decisions, i.e., that the mentions in the same document should be semantically related to each other, which degrades linking accuracy. In this paper, we present CIRCLE, a collective entity linking approach via Mention-Entity graph based in-context learning. In CIRCLE, we propose a logic-enhanced path information injection method, which leverages comparative and additive logic to enrich the path information. Moreover, we design a submodular function based demonstration selection method, which selects document-level demonstrations with high coverage of both semantic and path information. Furthermore, we design a Tree-of-Thoughts based demonstration format method, which uses a four-layer tree structure for hierarchical reasoning. Experimental results confirm the effectiveness of our approach.
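The abstract's demonstration selection step can be illustrated with a generic sketch. This is not the paper's actual formulation: it shows the standard greedy maximization of a monotone submodular coverage objective, with hypothetical feature sets standing in for the semantic and path information the paper covers.

```python
# Illustrative sketch only (hypothetical names, not CIRCLE's method):
# greedily select k demonstrations to maximize coverage of feature
# "units", a monotone submodular objective.

def coverage(selected, features):
    """Size of the union of feature sets over selected demonstrations."""
    covered = set()
    for i in selected:
        covered |= features[i]
    return len(covered)

def greedy_select(features, k):
    """Pick k demonstrations by largest marginal coverage gain.

    For monotone submodular objectives this greedy scheme gives the
    classic (1 - 1/e)-approximation guarantee."""
    selected = []
    candidates = set(range(len(features)))
    for _ in range(min(k, len(features))):
        best = max(candidates,
                   key=lambda i: coverage(selected + [i], features))
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy example: each demonstration covers some semantic/path units.
demos = [{"sem1", "path1"}, {"sem2"}, {"sem1", "sem2", "path2"}]
print(greedy_select(demos, 2))  # → [2, 0]
```

The greedy order matters: the third demonstration alone covers three units, and the first then adds the only remaining unit, so the pair {2, 0} covers everything.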