Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

Blohm, Matthias and Jagfeld, Glorianna and Sood, Ekta and Yu, Xiang and Vu, Ngoc Thang (2018) Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension. In: Proceedings of the 22nd Conference on Computational Natural Language Learning (CoNLL 2018). Association for Computational Linguistics, Brussels, Belgium, pp. 108-118.


Abstract

We propose a machine reading comprehension model based on the compare-aggregate framework with two-staged attention that achieves state-of-the-art results on the MovieQA question answering dataset. To investigate the limitations of our model as well as the behavioral differences between convolutional and recurrent neural networks, we generate adversarial examples to confuse the model and compare the results to human performance. Furthermore, we assess the generalizability of our model by analyzing its differences from human inference, drawing upon insights from cognitive science.
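
To make the compare-aggregate setup concrete, the following is a minimal sketch in Python (assuming PyTorch) of a candidate-answer scorer with two soft-attention stages (plot-to-question, then answer-to-plot) followed by a convolutional aggregator. All layer choices, dimensions, and names here are hypothetical illustrations of the general framework, not the authors' exact architecture.

# Minimal sketch of a compare-aggregate scorer with two attention stages.
# Assumes PyTorch; hyperparameters and layers are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompareAggregate(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=150):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.proj = nn.Linear(emb_dim, hidden)
        # CNN aggregator over comparison vectors; an RNN aggregator
        # would replace this convolution in a recurrent variant.
        self.conv = nn.Conv1d(hidden, hidden, kernel_size=3, padding=1)
        self.score = nn.Linear(hidden, 1)

    def attend(self, query, context):
        # Soft-align each query token to the context sequence.
        # query: (B, Lq, H), context: (B, Lc, H)
        scores = torch.bmm(query, context.transpose(1, 2))   # (B, Lq, Lc)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights, context)                    # (B, Lq, H)

    def forward(self, question, plot, answer):
        # Embed and project all inputs into a shared space.
        q = torch.relu(self.proj(self.emb(question)))         # (B, Lq, H)
        p = torch.relu(self.proj(self.emb(plot)))             # (B, Lp, H)
        a = torch.relu(self.proj(self.emb(answer)))           # (B, La, H)
        # First attention stage: attend from the plot to the question,
        # then compare via an element-wise interaction.
        p_q = self.attend(p, q)
        cmp1 = p * p_q                                        # (B, Lp, H)
        # Second attention stage: attend from the answer candidate to the
        # question-aware plot representation, then compare again.
        a_p = self.attend(a, cmp1)
        cmp2 = a * a_p                                        # (B, La, H)
        # Aggregate with a CNN, max-pool over time, and score the candidate.
        agg = torch.relu(self.conv(cmp2.transpose(1, 2)))     # (B, H, La)
        pooled = agg.max(dim=-1).values                       # (B, H)
        return self.score(pooled).squeeze(-1)                 # (B,)

In a MovieQA-style setting, such a scorer would be applied to each answer candidate and trained with a cross-entropy loss over the candidate scores; swapping the convolutional aggregator for a recurrent one gives the kind of CNN/RNN contrast the paper investigates.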

Item Type: Contribution in Book/Report/Proceedings
Additional Information: CoNLL 2018
ID Code: 130536
Deposited On: 23 Jan 2019 09:55
Refereed?: Yes
Published?: Published
Last Modified: 28 Nov 2023 11:00