Hybrid Prototype Contrastive Learning with Cross-Attention for Few-Shot Relation Classification

Authors: Zeyu Zhang, Shaowei Wang, Nana Bu, Junzhe Zhang, and Yuanyuan Xiao
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 1928-1943
Keywords: Few-shot relation classification, Prototype network, Relation information, Cross-attention mechanism, Contrastive learning.

Abstract

Few-shot relation classification (FSRC) aims to identify the relation class between entities in a text with a small amount of labeled data. Recently, some studies have focused on optimizing prototype representations by incorporating relation information into the prototype network or applying contrastive learning to alleviate the prediction confusion problem. However, these approaches primarily rely on global instance features and relation information, making it difficult to capture fine-grained local semantic information and leading to the misjudgment of abnormal samples and confusion between similar classes. To address these problems, we introduce a novel hybrid prototype contrastive learning (HPCL) model, which dynamically fuses global and local prototypes through a cross-attention mechanism and significantly improves the performance of few-shot relation classification. In addition, HPCL combines a dual contrastive learning strategy (relation-prototype contrastive learning and query-prototype contrastive learning) to effectively enhance intra-class feature sharing and inter-class feature discriminability by optimizing prototype representations. We have conducted extensive experiments on the public datasets FewRel 1.0 and FewRel 2.0, and the results show that HPCL not only performs well on traditional datasets but also demonstrates strong generalization ability in cross-domain adaptation tasks, effectively alleviating the challenges brought by data scarcity and insufficient relation descriptions.
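The abstract names two mechanisms: cross-attention fusion of global and local prototypes, and a dual contrastive objective over relation-prototype and query-prototype pairs. The PyTorch sketch below illustrates one plausible reading of that design; the module names, the gated fusion, the InfoNCE formulation, the dimensions, and the temperature are all illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of cross-attention prototype fusion plus a dual
# InfoNCE-style contrastive objective. All names, shapes, and
# hyperparameters here are assumptions for illustration only.
import torch
import torch.nn.functional as F
from torch import nn


class HybridPrototypeFusion(nn.Module):
    """Fuse a global (instance-level) prototype with local (token-level)
    support features via cross-attention, one prototype per class."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, global_proto, local_feats):
        # global_proto: (N, D)    one global prototype per class
        # local_feats:  (N, T, D) token features of the support instances
        q = global_proto.unsqueeze(1)                 # (N, 1, D) query
        local_proto, _ = self.attn(q, local_feats, local_feats)
        local_proto = local_proto.squeeze(1)          # (N, D)
        # A learned gate decides how much local detail to mix in.
        g = torch.sigmoid(self.gate(torch.cat([global_proto, local_proto], -1)))
        return g * global_proto + (1 - g) * local_proto


def info_nce(anchor, positives, temperature: float = 0.1):
    """InfoNCE loss: the i-th anchor should match the i-th positive,
    with all other rows acting as in-batch negatives."""
    logits = F.normalize(anchor, dim=-1) @ F.normalize(positives, dim=-1).T
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits / temperature, labels)


# Toy 5-way shapes: N=5 classes, T=8 support tokens, D=64 feature dims.
N, T, D = 5, 8, 64
fusion = HybridPrototypeFusion(D)
protos = fusion(torch.randn(N, D), torch.randn(N, T, D))  # hybrid prototypes
rel_emb = torch.randn(N, D)   # encoded relation descriptions, class-aligned
queries = torch.randn(N, D)   # one query instance per class, class-aligned

# Dual contrastive objective: relation-prototype + query-prototype terms.
loss = info_nce(rel_emb, protos) + info_nce(queries, protos)
loss.backward()
```

The gated residual mix is one simple way to let the model fall back to the global prototype when local token features are noisy; the paper may weight or combine the two prototypes differently.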