
Cross-Domain Few-Shot Relation Extraction via Representation Learning and Domain Adaptation


Source: arXiv
Abstract

Few-shot relation extraction aims to recognize novel relations from only a few labeled sentences per relation. Previous metric-based few-shot relation extraction algorithms identify relations by comparing prototypes, generated from the embeddings of the few labeled sentences, with the embeddings of query sentences using a trained metric function. However, because target domains often differ considerably from the training data, the generalization ability of these approaches to unseen relations in many domains is limited. Since the prototype is essential for capturing relationships between entities in the latent space, we propose learning more interpretable and effective prototypes from prior knowledge and the intrinsic semantics of relations, so that new relations can be extracted more effectively across domains. By exploring the relationships among relations using prior information, we improve the prototype representations of relations. By using contrastive learning to make the classification margins between sentence embeddings more distinct, the geometric interpretability of the prototypes is enhanced. Additionally, a transfer learning approach for the cross-domain problem allows the prototype generation process to account for the gap between domains, making the prototypes more robust and enabling better extraction of associations across multiple domains. Experimental results on the FewRel benchmark demonstrate the advantages of the proposed method over several state-of-the-art approaches.
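The metric-based baseline the abstract describes (comparing query embeddings against class prototypes built from a few labeled support sentences) can be sketched as follows. This is a generic prototypical-network-style illustration, not the paper's actual method: the function names are made up, the embeddings are assumed to be precomputed vectors, and squared Euclidean distance stands in for the trained metric function.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_relations):
    """Prototype of each relation = mean embedding of its support sentences."""
    return np.stack([support_emb[support_labels == r].mean(axis=0)
                     for r in range(n_relations)])

def classify(query_emb, protos):
    """Assign each query to the relation with the nearest prototype
    (squared Euclidean distance as an illustrative metric)."""
    dists = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=-1)

# Toy 2-way 2-shot episode in a 2-D embedding space (values are illustrative).
support_emb = np.array([[0.0, 0.0], [0.0, 1.0],   # relation 0
                        [5.0, 5.0], [5.0, 6.0]])  # relation 1
support_labels = np.array([0, 0, 1, 1])
protos = prototypes(support_emb, support_labels, n_relations=2)

queries = np.array([[0.2, 0.3], [4.8, 5.1]])
preds = classify(queries, protos)  # one prediction per query sentence
```

The paper's contribution is to make the prototypes themselves better: shaping them with prior knowledge about inter-relation structure, sharpening the margins between sentence embeddings via contrastive learning, and adapting the prototype generation to the target domain, rather than relying on a plain support-set mean as above.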

Zhenkun Wang, Genghui Li, Zhongju Yuan

Subject: Computing Technology, Computer Science

Zhenkun Wang, Genghui Li, Zhongju Yuan. Cross-Domain Few-Shot Relation Extraction via Representation Learning and Domain Adaptation [EB/OL]. (2022-12-05) [2025-07-16]. https://arxiv.org/abs/2212.02560.
