A Flat-Span Contrastive Learning Method for Nested Named Entity Recognition
Yaodi Liu, Kun Zhang, Rong Tong, Chenxi Cai, Dianying Chen, Xiaohe Wu
Published 2024 in International Conference on Asian Language Processing
ABSTRACT
Most named entity recognition (NER) methods can handle only flat entities and ignore nested ones, yet in natural language processing (NLP) it is common for one entity to contain others. We therefore propose a flat-span contrastive learning (Fla-SpaCL) method for nested NER. The method comprises two sub-modules: a flat NER module for outer entities and a contrastive-learning-based candidate span classification module. In the flat NER module, we use a Star-Transformer and a conditional random field (CRF) to identify the outer entities. In the candidate span classification module, we first generate inner candidate spans from the identified outer entities. Second, to better distinguish entity spans from non-entity spans, we introduce contrastive learning to maximize the similarity between entity spans and use the InfoNCE loss function to handle hard negative samples. Finally, multi-task learning jointly optimizes the flat NER module and the candidate span classification module, reducing error propagation and improving model performance. In the experimental analysis, we compare the proposed model with baseline models to verify its effectiveness.
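The contrastive objective described in the abstract pulls entity-span representations together while pushing them away from non-entity (negative) spans via the InfoNCE loss. The paper's implementation details are not given here, so the following is only a minimal numpy sketch of a generic InfoNCE loss over span embeddings; the function name, temperature value, and cosine-similarity choice are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def info_nce_loss(anchor, positives, negatives, temperature=0.1):
    """Generic InfoNCE loss for one anchor span embedding.

    anchor: (d,) embedding of an entity span.
    positives: (p, d) embeddings of other entity spans (same class of sample).
    negatives: (n, d) embeddings of non-entity spans, e.g. hard negatives.
    Returns the mean of -log(sim_pos / (sim_pos + sum(sim_neg))) over positives.
    """
    def cos_sim(a, b):
        # Cosine similarity between one vector a and each row of b.
        a = a / np.linalg.norm(a, axis=-1, keepdims=True)
        b = b / np.linalg.norm(b, axis=-1, keepdims=True)
        return b @ a

    pos = np.exp(cos_sim(anchor, positives) / temperature)   # (p,)
    neg = np.exp(cos_sim(anchor, negatives) / temperature)   # (n,)
    # Each positive pair competes against all negatives in the denominator.
    return float(np.mean(-np.log(pos / (pos + neg.sum()))))
```

Under this formulation, the loss shrinks as the anchor aligns with positive entity spans and grows when it resembles the negative (non-entity) spans, which is the behavior the abstract attributes to the candidate span classification module.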
PUBLICATION RECORD
- Publication date
2024-08-04
- Fields of study
Computer Science
- Source metadata
Semantic Scholar