Transformer-based models for answer extraction in text-based question answering

Date
2023-04-01
Abstract
The success of transformer-based language models has led to a surge of research across natural language processing tasks, among which extractive question answering (answer span detection) has received considerable attention in recent years. However, to date, no comprehensive study has compared the performance of different transformer-based language models on the task of question answering (QA). Furthermore, while these models capture substantial semantic and syntactic knowledge of a natural language, their potential for improved QA performance through the incorporation of explicit linguistic features remains unexplored. In this study, we compare the efficacy of multiple transformer-based models for QA, as well as their performance on particular question types. Moreover, we investigate whether augmenting transformer-based language models with a set of linguistic features extracted from the question and context passage can enhance their QA performance. In particular, we examine several feature-augmented transformer-based architectures to explore the impact of these linguistic features on a number of transformer-based language models. Furthermore, we conduct an ablation study to analyze the individual effect of each feature. Through extensive experiments on two question-answering datasets (SQuAD and NLQuAD), we show that the proposed framework can improve the performance of transformer-based models.
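
To make the task concrete: in extractive QA, the model predicts the start and end positions of the answer span within the context passage. Below is a minimal sketch of answer span detection with a pre-trained transformer, assuming the Hugging Face transformers library; the checkpoint name is illustrative and is not necessarily one of the models evaluated in this work.

import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Illustrative SQuAD-fine-tuned checkpoint; any extractive-QA model works the same way.
model_name = "distilbert-base-uncased-distilled-squad"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "What datasets were used in the experiments?"
context = ("Experiments were conducted on two question-answering datasets, "
           "SQuAD and NLQuAD.")

# Encode the question and context as a single sequence pair.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The model emits one start logit and one end logit per token; the predicted
# answer span runs from the highest-scoring start to the highest-scoring end.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits)
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)

The feature-augmented architectures studied in the thesis build on this same span-prediction setup by additionally feeding linguistic features extracted from the question and context into the model; the exact augmentation mechanism is described in the full text rather than in this sketch.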
Keywords
Question-answering, Answer span detection, Transformer-based models, Pre-trained models