

Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"


Get started


This code is written in Python 3. You will need to install a few Python packages in order to run the code. We recommend using virtualenv to manage your Python packages and environments. Please take the following steps to create a Python virtual environment.
  • If you have not installed virtualenv, install it with
    pip install virtualenv
  • Create a virtual environment with
    virtualenv venv
  • Activate the virtual environment with
    source venv/bin/activate
  • Install the package requirements with
    pip install -r requirements.txt
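As a quick sanity check (not part of the repo), you can confirm from Python itself that the interpreter you are running actually lives inside the activated virtual environment:

```python
import sys

def in_virtualenv() -> bool:
    """True when the running interpreter belongs to a virtualenv/venv.

    Old-style virtualenv sets sys.real_prefix; the stdlib venv module
    instead sets sys.base_prefix to the base installation, so the two
    prefixes differ inside an environment.
    """
    return (
        getattr(sys, "real_prefix", None) is not None
        or sys.prefix != getattr(sys, "base_prefix", sys.prefix)
    )

print("virtualenv active:", in_virtualenv())
```

If this prints False after you ran `source venv/bin/activate`, the activation did not take effect in your current shell.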

Run the KBQA system

  • Download the preprocessed data from here and put the data folder under the root directory.

  • Create a folder to save model checkpoints. You can download the pretrained models from here. (Note: if you cannot access the above data and pretrained models, please download them from here.)
  • Modify the config files in the config folder to suit your needs. Note that you can start by modifying only the data folder and vocab size, and leave the other hyperparameters as they are.
  • Go to the source code folder and train the BAMnet model:
    python -config config/bamnet_webq.yml
  • Test the BAMnet model (with the ground-truth topic entity):
    python -config config/bamnet_webq.yml
  • Train the topic entity predictor:
    python -config config/entnet_webq.yml
  • Test the topic entity predictor:
    python -config config/entnet_webq.yml
  • Test the whole system (BAMnet + topic entity predictor):
    python -bamnet_config config/bamnet_webq.yml -entnet_config config/entnet_webq.yml -raw_data ../data/WebQ
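As an illustration of the kind of config edits described above, a config file might contain entries like these (the key names here are hypothetical; check the shipped .yml files in the config folder for the real schema):

```yaml
# Hypothetical sketch, not the actual BAMnet config schema.
data_dir: ../data/WebQ     # point this at the preprocessed data folder
vocab_size: 100000         # set from the statistics printed during preprocessing
# leave the remaining hyperparameters at their defaults to start with
```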

Preprocess the dataset on your own

  • Go to the source code folder. To prepare data for the BAMnet model, run the following cmd:
    python -data_dir ../data/WebQ -fb_dir ../data/WebQ -out_dir ../data/WebQ
  • To prepare data for the topic entity predictor model, run the following cmd:
    python -dtype ent -data_dir ../data/WebQ -fb_dir ../data/WebQ -out_dir ../data/WebQ

Note that the printed messages include some data statistics, such as the vocab size. These numbers will be used later when modifying the config files.
  • Download the pretrained GloVe word embeddings.

  • Unzip the file and convert the GloVe format to the word2vec format using the following cmd:
    python -m gensim.scripts.glove2word2vec --input glove.840B.300d.txt --output glove.840B.300d.w2v
  • Fetch the pretrained GloVe vectors for our vocabulary:
    python -emb glove.840B.300d.w2v -data_dir ../data/WebQ -out ../data/WebQ/glove_pretrained_300d_w2v.npy -emb_size 300
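Under the hood, the conversion step above only prepends a "count dimension" header line: GloVe's text format and word2vec's text format are otherwise identical. A stdlib-only sketch of the same idea (assuming no vector token contains spaces):

```python
def glove_to_word2vec(glove_path: str, out_path: str) -> None:
    """Prepend the 'num_vectors dim' header required by the word2vec
    text format; the vector lines themselves are copied unchanged."""
    with open(glove_path, encoding="utf-8") as f:
        lines = f.readlines()
    # Dimensionality = whitespace-separated fields minus the word itself.
    dim = len(lines[0].split()) - 1
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(f"{len(lines)} {dim}\n")
        f.writelines(lines)
```

For the real 840B-token GloVe file, prefer the gensim script shown above, which also handles edge cases in the vocabulary.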


Experiment results on WebQuestions

Results on WebQuestions test set. Bold: best in-category performance.

Predicted answers of BAMnet w/ and w/o bidirectional attention on the WebQuestions test set


Attention heatmap generated by the reasoning module



If you found this code useful, please consider citing the following paper:

Yu Chen, Lingfei Wu, Mohammed J. Zaki. "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases." In Proc. 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT2019). June 2019.

@article{chen2019bidirectional,
  title={Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases},
  author={Chen, Yu and Wu, Lingfei and Zaki, Mohammed J},
  journal={arXiv preprint arXiv:1903.02188},
  year={2019}
}
