Transformer-based models for answer extraction in text-based question/answering

dc.contributor.advisor: Davoudi, Heidar (Kourosh)
dc.contributor.author: Ahmadi Najafabadi, Marzieh
dc.date.accessioned: 2023-04-24T15:19:18Z
dc.date.available: 2023-04-24T15:19:18Z
dc.date.issued: 2023-04-01
dc.degree.discipline: Computer Science
dc.degree.level: Master of Science (MSc)
dc.description.abstract: The success of transformer-based language models has led to a surge of research across natural language processing tasks, among which extractive question-answering (answer span detection) has received considerable attention in recent years. However, to date, no comprehensive study has compared and examined the performance of different transformer-based language models on the question-answering (QA) task. Furthermore, while these models capture significant semantic and syntactic knowledge of natural language, their potential for improved QA performance through the incorporation of linguistic features remains unexplored. In this study, we compare the efficacy of multiple transformer-based models for QA, as well as their performance on particular question types. Moreover, we investigate whether augmenting transformer-based language models with a set of linguistic features extracted from the question and context passage can enhance their QA performance. In particular, we examine several feature-augmented transformer-based architectures to explore the impact of these linguistic features on different transformer-based language models. Furthermore, an ablation study is conducted to analyze the individual effect of each feature. Through extensive experiments on two question-answering datasets (SQuAD and NLQuAD), we show that the proposed framework can improve the performance of transformer-based models.
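
For context, the following is a minimal sketch of the extractive QA (answer span detection) setup described in the abstract, assuming the Hugging Face Transformers library and an off-the-shelf SQuAD-fine-tuned checkpoint. The model name, question, and context strings are illustrative placeholders; the sketch shows only the plain span-detection baseline, not the thesis' feature-augmented architectures.

```python
# Minimal sketch of extractive QA (answer span detection) with a pre-trained
# transformer. The checkpoint and example texts are illustrative assumptions,
# not the thesis' own models or data.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "distilbert-base-cased-distilled-squad"  # any SQuAD-style QA checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "Which datasets were used in the experiments?"
context = (
    "Extensive experiments were conducted on two question-answering datasets, "
    "SQuAD and NLQuAD, to evaluate the proposed framework."
)

# Encode question and context as one sequence: [CLS] question [SEP] context [SEP]
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# The model produces, for every token, a start logit and an end logit;
# the predicted answer span runs from the most likely start to the most likely end.
start_idx = int(outputs.start_logits.argmax())
end_idx = int(outputs.end_logits.argmax())

answer_tokens = inputs["input_ids"][0][start_idx : end_idx + 1]
answer = tokenizer.decode(answer_tokens, skip_special_tokens=True)
print(answer)  # expected to resemble "SQuAD and NLQuAD"
```

In this formulation the model scores every context token as a potential start and end of the answer, and decoding the tokens between the highest-scoring start and end positions yields the extracted span; the thesis' contribution is to compare such models and to augment their inputs with linguistic features.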
dc.description.sponsorship: University of Ontario Institute of Technology
dc.identifier.uri: https://hdl.handle.net/10155/1598
dc.language.iso: en
dc.subject: Question-answering
dc.subject: Answer span detection
dc.subject: Transformer-based models
dc.subject: Pre-trained models
dc.title: Transformer-based models for answer extraction in text-based question/answering
dc.type: Thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: University of Ontario Institute of Technology
thesis.degree.name: Master of Science (MSc)

Files

Original bundle

Name: Ahmadi_Najafabadi_Marzieh.pdf
Size: 11.58 MB
Format: Adobe Portable Document Format