Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model
Published 2019 in Computational Intelligence and Neuroscience
ABSTRACT
We explored several approaches to incorporating context information into deep learning frameworks for text classification, including designing attention mechanisms for different neural network architectures and extracting additional features from the text with traditional methods as part of the representation. We propose two classification algorithms: one based on a convolutional neural network that fuses context information, and the other based on a bidirectional long short-term memory (BiLSTM) network. We integrate the context information into the final feature representation by designing attention structures at the sentence level and the word level, which increases the diversity of the feature information. Experimental results on two datasets validate the advantages of the two models in time efficiency and accuracy compared to models with standard attention mechanism (AM) architectures.
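The paper does not reproduce its model code here, but the word-level attention pooling it describes follows a common pattern: score each word's hidden state (e.g. a BiLSTM output) against a context vector, normalize the scores with a softmax, and take the weighted sum as the sentence representation. The sketch below is a minimal NumPy illustration of that pattern, not the authors' implementation; the hidden states `H` and context vector `u` are randomly generated stand-ins for learned quantities.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(hidden_states, context_vector):
    """Word-level attention pooling.

    hidden_states: (T, d) array, one hidden vector per word.
    context_vector: (d,) learned query used to score each word.
    Returns the attention-weighted sentence vector and the weights.
    """
    scores = hidden_states @ context_vector   # (T,) relevance of each word
    weights = softmax(scores)                 # (T,) normalized attention
    sentence_vec = weights @ hidden_states    # (d,) weighted sum of states
    return sentence_vec, weights

# Toy example: 5 words, hidden size 8 (hypothetical sizes).
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # stand-in for BiLSTM outputs
u = rng.standard_normal(8)        # stand-in for a learned context vector
v, w = word_attention(H, u)
```

Sentence-level attention described in the abstract would apply the same scoring-and-pooling step one level up, over sentence vectors rather than word vectors.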
PUBLICATION RECORD
- Publication date: 2019-08-01
- Venue: Computational Intelligence and Neuroscience
- Fields of study: Medicine, Computer Science
- Source metadata: Semantic Scholar, PubMed