Resolving passage ambiguity in machine reading comprehension using lightweight transformer architectures

Adnan Nawaz, Muzamil Ahmed, Hikmat Ullah Khan, Ali Daud, Bader Alshemaimri, Tassawar Iqbal

Published 2025 in Scientific Reports

ABSTRACT

Machine Reading Comprehension (MRC) refers to generating precise answers to users’ queries from text content using natural language processing. The exponential growth and complexity of online content have made it difficult to locate the required information, as users must navigate several web pages to retrieve precise and accurate answers to their questions. MRC has therefore emerged as an active and growing research area in recent years. Existing studies highlight the significance of deep learning models yet fall short in resolving ambiguity, especially in complex passages. Bidirectional encoder representations from transformers (BERT) have addressed passage ambiguity resolution, but their complexity demands high computational resources and large volumes of data for effective text comprehension. To address passage ambiguities while reducing computational cost, this study fine-tunes the DistilBERT model for the MRC task. The resulting model, termed Distil-BERT-MRC, uses a reduced architecture that ensures efficiency while maintaining competitive performance. A detailed analysis demonstrates that Distil-BERT-MRC attains up to a 90.23% exact match and a 91.42% F1 score on the WikiQA dataset. Moreover, to assess generalizability and resource utilization, extensive experiments were performed on SQuAD 2.0, NewsQA, and Natural Questions using recent transformer models, including RoBERTa and XLNet. Overall, our findings confirm that distilled transformer models provide a resource-efficient and effective approach to MRC tasks.
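The exact match (EM) and F1 figures reported above are the standard SQuAD-style extractive-QA metrics: EM checks whether the normalized predicted span equals the normalized reference answer, while F1 measures token overlap between the two. The paper does not include its evaluation code, so the following is a minimal illustrative sketch of these two metrics, assuming the conventional SQuAD answer normalization (lowercasing, stripping punctuation and English articles, collapsing whitespace); all function names are ours, not the authors'.

```python
import re
import string
from collections import Counter


def normalize(text: str) -> str:
    """Normalize an answer string: lowercase, drop punctuation and
    the articles a/an/the, and collapse whitespace (SQuAD convention)."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in set(string.punctuation))
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def exact_match(prediction: str, reference: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(reference))


def f1_score(prediction: str, reference: str) -> float:
    """Token-level F1 between the normalized prediction and reference."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

In a full evaluation these per-example scores are averaged over the dataset, taking the maximum score over all gold answers when a question has several acceptable references.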
