TensorFlow implementation of Match-LSTM and Answer pointer for the popular SQuAD dataset.

Match-LSTM and Answer Pointer (Wang and Jiang, ICLR 2017)

This repo attempts to reproduce the Match-LSTM and Answer Pointer experiments from the 2016 paper of the same name. Much of the preprocessing boilerplate code is adapted from Stanford CS224D.
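The paper's answer pointer, in its boundary variant, predicts a start position and an end position over the passage. As a rough illustration of that idea only, here is a minimal NumPy sketch; the real model scores positions with a pointer network conditioned on the Match-LSTM states, and every name and shape below is invented for the example.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def answer_span(H, w_s, w_e):
    # H: (T, d) passage representations; w_s, w_e: (d,) scoring
    # vectors (hypothetical parameters, not this repo's API).
    p_start = softmax(H @ w_s)   # distribution over start positions
    p_end = softmax(H @ w_e)     # distribution over end positions
    return int(p_start.argmax()), int(p_end.argmax()), p_start, p_end

rng = np.random.default_rng(0)
H = rng.standard_normal((12, 8))  # toy "passage" of 12 token vectors
s, e, p_s, p_e = answer_span(H, rng.standard_normal(8), rng.standard_normal(8))
```

The predicted span is simply the argmax of each distribution; the paper additionally constrains the end to come after the start at decode time.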

I had to modify TensorFlow's original attention mechanism implementation for the code to be correct; that modification is the meat of this code. Run the training script to train the model and the inference script to generate answers given a set of paragraphs. Contact me at [email protected] for more info.

This code also serves as an example of how TensorFlow's attention mechanism can be wired together. As of August 13, 2017, no such example was available anywhere else.
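For intuition, the attention step at the heart of Match-LSTM computes, for each passage word, a softmax-weighted summary of the question states. The NumPy sketch below shows that additive-attention computation in isolation under toy, assumed shapes; it omits the paper's dependence on the previous Match-LSTM hidden state, and all parameter names are placeholders rather than this repo's API.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def match_attention(h_q, h_p_i, W_q, W_p, w):
    # One attention step: attend over all question hidden states
    # for a single passage hidden state. The term involving the
    # previous Match-LSTM state is omitted here for brevity.
    # h_q: (J, d) question states; h_p_i: (d,); W_q, W_p: (d, d); w: (d,)
    G = np.tanh(h_q @ W_q + h_p_i @ W_p)  # (J, d); passage term broadcasts
    alpha = softmax(G @ w)                # attention weights over question words
    return alpha @ h_q, alpha             # weighted question summary, weights

rng = np.random.default_rng(0)
J, d = 5, 4                               # toy question length and hidden size
ctx, alpha = match_attention(rng.standard_normal((J, d)),
                             rng.standard_normal(d),
                             rng.standard_normal((d, d)),
                             rng.standard_normal((d, d)),
                             rng.standard_normal(d))
```

In the full model, the summary `ctx` is concatenated with the passage state and fed into the Match-LSTM cell at each step.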


Before training, you're going to want to do some preprocessing of the data. Run the following from the command line:

$ python preprocessing/
$ python preprocessing/
$ python

The last step can take a bit of time (~30 minutes).
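For readers unfamiliar with this kind of preprocessing, the usual steps are tokenizing the text, building a vocabulary, and mapping words to integer ids. The sketch below illustrates those generic steps only; this repo's actual scripts may differ, and every name in it is invented.

```python
from collections import Counter

def tokenize(text):
    # crude whitespace tokenizer, purely for illustration
    return text.lower().split()

def build_vocab(texts, min_count=1):
    counts = Counter(tok for t in texts for tok in tokenize(t))
    # reserve 0 for padding and 1 for out-of-vocabulary words
    vocab = {"<pad>": 0, "<unk>": 1}
    for tok, c in counts.most_common():
        if c >= min_count:
            vocab[tok] = len(vocab)
    return vocab

def to_ids(text, vocab):
    # map each token to its id, falling back to <unk>
    return [vocab.get(tok, vocab["<unk>"]) for tok in tokenize(text)]

paragraph = "The quick brown fox jumps over the lazy dog"
question = "What jumps over the dog ?"
vocab = build_vocab([paragraph, question])
p_ids = to_ids(paragraph, vocab)
q_ids = to_ids(question, vocab)
```

The resulting id sequences are what get padded into batches and fed to the model.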


After preprocessing is complete, you can train your model by running the following command:

$ python

Note that depending on your configs, this model will train for a very long time! With the default configs, a modern laptop takes roughly two hours per epoch, and the default is 30 epochs (~60 hours total).
