LabelPrompt: Effective Prompt-based Learning for Relation Classification

Bibliographic Details
Title: LabelPrompt: Effective Prompt-based Learning for Relation Classification
Authors: Zhang, Wenjie, Song, Xiaoning, Feng, Zhenhua, Xu, Tianyang, Wu, Xiaojun
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Artificial Intelligence, Computer Science - Information Retrieval, Computer Science - Machine Learning
More Details: Recently, prompt-based learning has gained popularity across many natural language processing (NLP) tasks by reformulating them into a cloze-style format to better align pre-trained language models (PLMs) with downstream tasks. However, applying this approach to relation classification poses unique challenges. Specifically, associating natural language words that fill the masked token with semantic relation labels (e.g. "org:founded_by") is difficult. To address this challenge, this paper presents a novel prompt-based learning method, namely LabelPrompt, for the relation classification task. Motivated by the intuition to "GIVE MODEL CHOICES!", we first define additional tokens to represent relation labels, regarding these tokens as the verbaliser with semantic initialisation, and explicitly construct them with a prompt template method. Then, to mitigate inconsistency between predicted relations and given entities, we implement an entity-aware module with contrastive learning. Last, we apply an attention query strategy within the self-attention layer to differentiate prompt tokens from sequence tokens. Together, these strategies enhance the adaptability of prompt-based learning, especially when only small labelled datasets are available. Comprehensive experiments on benchmark datasets demonstrate the superiority of our method, particularly in the few-shot scenario.
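The abstract's core idea, giving the model explicit label-token "choices" in a cloze-style prompt, can be illustrated with a minimal sketch. All names below (make_prompt, LABEL_TOKENS, the template wording) are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of cloze-style prompt construction for relation
# classification: each relation label gets a dedicated token, and the input
# is reformulated so the PLM fills a [MASK] slot with one of those tokens.
# In the paper these tokens would be semantically initialised (e.g. from the
# embeddings of "org", "founded", "by"); here they are plain placeholders.

LABEL_TOKENS = {
    "org:founded_by": "[REL0]",
    "per:employee_of": "[REL1]",
    "no_relation": "[REL2]",
}

def make_prompt(sentence: str, head: str, tail: str) -> str:
    """Build a cloze-style prompt whose [MASK] slot is to be filled
    with one of the dedicated relation-label tokens."""
    choices = " ".join(LABEL_TOKENS.values())
    return (
        f"{sentence} The relation between {head} and {tail} is [MASK]. "
        f"Choices: {choices}"
    )

prompt = make_prompt("Bill Gates founded Microsoft.", "Microsoft", "Bill Gates")
print(prompt)
```

A masked-language model scored on this prompt would then only need to rank the small set of label tokens at the [MASK] position, rather than map arbitrary vocabulary words onto structured labels such as "org:founded_by".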
Comment: 20 pages, 5 figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2302.08068
Accession Number: edsarx.2302.08068
Database: arXiv