Conditioning LSTM Decoder and Bi-directional Attention Based Question Answering System
| Title: | Conditioning LSTM Decoder and Bi-directional Attention Based Question Answering System |
| --- | --- |
| Authors: | Liu, Heguang |
| Publication Year: | 2019 |
| Collection: | Computer Science; Statistics |
| Subject Terms: | Computer Science - Computation and Language, Computer Science - Artificial Intelligence, Computer Science - Machine Learning, Statistics - Machine Learning |
| More Details: | Applying neural networks to question answering has gained increasing popularity in recent years. In this paper, I implement a model with a bi-directional attention flow layer connected to a multi-layer LSTM encoder, followed by a start-index decoder and a conditioning end-index decoder. I introduce a new end-index decoder layer that conditions on the start-index output; experiments show this increases model performance by 15.16%. For prediction, I propose a new smart-span equation that rewards both short answer length and high start-index and end-index probabilities, which further improves prediction accuracy. The best single model achieves an F1 score of 73.97% and an EM score of 64.95% on the test set. Comment: 7 pages, 7 figures |
| Document Type: | Working Paper |
| Access URL: | http://arxiv.org/abs/1905.02019 |
| Accession Number: | edsarx.1905.02019 |
| Database: | arXiv |
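The record does not include the paper's exact smart-span equation; the abstract only says it rewards high start-index and end-index probabilities while favoring short spans. The sketch below is one plausible reading of such a rule, not the paper's formulation: `p_start`, `p_end`, the length-penalty exponent `alpha`, and the `max_len` cutoff are all assumptions introduced here for illustration.

```python
import numpy as np


def smart_span(p_start: np.ndarray, p_end: np.ndarray,
               max_len: int = 15, alpha: float = 1.0):
    """Pick an answer span (i, j) balancing probability against length.

    Assumed scoring rule (hypothetical, not taken from the paper):
        score(i, j) = p_start[i] * p_end[j] / (j - i + 1) ** alpha
    so confident start/end indices are rewarded and long spans are penalized.
    """
    best_score, best_span = -1.0, (0, 0)
    n = len(p_start)
    for i in range(n):
        # only consider end indices at or after the start, up to max_len tokens
        for j in range(i, min(i + max_len, n)):
            score = p_start[i] * p_end[j] / (j - i + 1) ** alpha
            if score > best_score:
                best_score, best_span = score, (i, j)
    return best_span, best_score


if __name__ == "__main__":
    # toy start/end distributions over a 6-token context
    p_start = np.array([0.05, 0.60, 0.15, 0.10, 0.05, 0.05])
    p_end = np.array([0.05, 0.10, 0.55, 0.15, 0.10, 0.05])
    print(smart_span(p_start, p_end))  # best span is (1, 2)
```

Dividing by the span length makes longer spans pay a proportional penalty, which matches the abstract's stated goal of rewarding short answers without ignoring the decoder probabilities; the paper's actual equation and constants may differ.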