Tree Transformer

This is the official implementation of the paper Tree Transformer: Integrating Tree Structures into Self-Attention. If you use this code or our results in your research, we would appreciate it if you cited our paper as follows:

@article{Wang2019TreeTransformer,
  title={Tree Transformer: Integrating Tree Structures into Self-Attention},
  author={Yau-Shian Wang and Hung-Yi Lee and Yun-Nung Chen},
  journal={arXiv preprint arXiv:1909.06639},
  year={2019}
}
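At its core, the model biases each self-attention layer with a "constituent prior": for every pair of positions, the probability that the two words belong to the same constituent, built from link probabilities between adjacent words. Below is a minimal PyTorch sketch of that prior, written for illustration only; the function name, shapes, and epsilon are our own and are not taken from this repository.

import torch

def constituent_prior(a, eps=1e-9):
    # a: tensor of shape (L-1,), where a[k] is the probability that
    # adjacent words k and k+1 belong to the same constituent.
    L = a.size(0) + 1
    # cum[k] = sum of log a[0..k-1], with cum[0] = 0.
    cum = torch.cat([torch.zeros(1), torch.cumsum(torch.log(a + eps), dim=0)])
    idx = torch.arange(L)
    i, j = idx.unsqueeze(1), idx.unsqueeze(0)
    lo, hi = torch.min(i, j), torch.max(i, j)
    # C[i, j] = product of the link probabilities between positions i and j.
    return torch.exp(cum[hi] - cum[lo])

a = torch.tensor([0.9, 0.1, 0.8])   # 4 words, 3 adjacent links
C = constituent_prior(a)            # (4, 4); C[0, 1] = 0.9, C[0, 2] = 0.09

In the paper, these pairwise priors scale the attention probabilities elementwise, and the link probabilities are constrained to grow monotonically across layers so that constituents only merge as depth increases.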

Dependencies

  • python3
  • pytorch 1.0

We use the BERT tokenizer from PyTorch-Transformers to tokenize words. Please install PyTorch-Transformers following the instructions in that repository. A minimal tokenization sketch is shown below.
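For reference, tokenizing with PyTorch-Transformers might look as follows; the 'bert-base-uncased' checkpoint is our assumption, as this repository may pin a different vocabulary.

from pytorch_transformers import BertTokenizer

# Load a pretrained BERT vocabulary ('bert-base-uncased' is assumed here).
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
tokens = tokenizer.tokenize("The cat sat on the mat .")
ids = tokenizer.convert_tokens_to_ids(tokens)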

Training

For grammar induction training:

python3 main.py -train -model_dir [model_dir] -num_step 60000

The default setting achieves an F1 of approximately 49.5 on the WSJ test set. The training file 'data/train.txt' includes all WSJ data except sections 22 and 23.

Evaluation

For grammar induction testing:

python3 main.py -test -model_dir [model_dir]

The code creates a result directory named [model_dir] containing 'bracket.json' and 'tree.txt'. 'bracket.json' holds the brackets of the trees output by the model, which can be used to evaluate F1; the ground-truth brackets for the test data can be obtained with the ON-LSTM code. 'tree.txt' contains the parse trees. The default test file 'data/test.txt' contains the sentences of WSJ section 23.
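For orientation, unlabeled bracketing F1 can be computed from the predicted and gold span sets as in the sketch below. This is a generic illustration, not the evaluation script used in the paper, and it assumes each sentence's brackets are given as (start, end) pairs.

def bracket_f1(pred, gold):
    # Unlabeled bracketing F1 between two lists of (start, end) spans.
    pred, gold = set(map(tuple, pred)), set(map(tuple, gold))
    if not pred or not gold:
        return 0.0
    p = len(pred & gold) / len(pred)   # precision
    r = len(pred & gold) / len(gold)   # recall
    return 2 * p * r / (p + r) if p + r > 0 else 0.0

print(bracket_f1([(0, 3), (1, 3)], [(0, 3), (1, 2)]))  # 0.5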

Acknowledgements

Contact

[email protected]
