An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
Machine translation is a natural language processing task that aims to automatically translate between natural languages using computers. Recent years have witnessed the rapid development of end-to-end neural machine translation, which has become the new mainstream method in practical MT systems.
THUMT is an open-source toolkit for neural machine translation developed by the Natural Language Processing Group at Tsinghua University. The website of THUMT is: http://thumt.thunlp.org/.
The online demo of THUMT is available at http://translate.thumt.cn/. The languages involved include Ancient Chinese, Arabic, Chinese, English, French, German, Indonesian, Japanese, Portuguese, Russian, and Spanish.
THUMT currently has three main implementations:

THUMT-PyTorch: an implementation developed with PyTorch. It implements the Transformer model (Vaswani et al., 2017).

THUMT-TensorFlow: an implementation developed with TensorFlow. It implements the sequence-to-sequence model (Seq2Seq) (Sutskever et al., 2014), the standard attention-based model (RNNsearch) (Bahdanau et al., 2014), and the Transformer model (Vaswani et al., 2017).
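THUMT's own implementations live in the toolkit itself; purely as an illustrative sketch (not THUMT code), the additive attention score at the heart of RNNsearch (Bahdanau et al., 2014) can be written in a few lines of NumPy. All names below (`additive_attention`, the weight matrices `W_a`, `U_a`, `v_a`) are hypothetical:

```python
import numpy as np

def additive_attention(s_prev, h, W_a, U_a, v_a):
    """Additive (Bahdanau) attention: score each encoder state h_j
    against the previous decoder state s_prev, then softmax."""
    # e_j = v_a^T tanh(W_a s_{i-1} + U_a h_j)   (Bahdanau et al., 2014)
    scores = np.tanh(s_prev @ W_a + h @ U_a) @ v_a   # shape (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # softmax over source positions
    context = weights @ h                            # weighted sum of encoder states
    return weights, context

# Toy shapes: source length T=5, hidden size d=8.
rng = np.random.default_rng(0)
T, d = 5, 8
h = rng.normal(size=(T, d))        # encoder hidden states
s = rng.normal(size=(d,))          # previous decoder state
W_a = rng.normal(size=(d, d))
U_a = rng.normal(size=(d, d))
v_a = rng.normal(size=(d,))
w, c = additive_attention(s, h, W_a, U_a, v_a)
```

The attention weights `w` form a distribution over source positions, and `c` is the context vector fed to the decoder at the next step.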
THUMT-Theano: the original implementation developed with Theano, which is no longer updated because MILA ceased development of Theano. It implements the standard attention-based model (RNNsearch) (Bahdanau et al., 2014), minimum risk training (MRT) (Shen et al., 2016) for optimizing model parameters directly with respect to evaluation metrics, semi-supervised training (SST) (Cheng et al., 2016) for exploiting monolingual corpora to learn bidirectional translation models, and layer-wise relevance propagation (LRP) (Ding et al., 2017) for visualizing and analyzing RNNsearch.
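As a rough sketch of the MRT idea (again not THUMT code), the training objective of Shen et al. (2016) is the expected risk over a sampled set of candidate translations, where the model's probabilities are renormalized over that set with a sharpness hyper-parameter. The function name `mrt_loss` and the toy numbers are assumptions for illustration:

```python
import numpy as np

def mrt_loss(log_probs, risks, alpha=5e-3):
    """Minimum risk training objective over a sampled candidate set.

    log_probs: model log-probabilities of each candidate translation.
    risks: per-candidate risk, e.g. 1 - sentence-level BLEU vs. the reference.
    alpha: sharpness hyper-parameter (Shen et al., 2016).
    """
    scaled = alpha * np.asarray(log_probs)
    q = np.exp(scaled - scaled.max())
    q /= q.sum()                          # Q(y|x): distribution over the candidate set
    return float(q @ np.asarray(risks))   # expected risk, to be minimized

# Three toy candidates with decreasing model probability and increasing risk.
loss = mrt_loss(log_probs=[-2.0, -5.0, -9.0], risks=[0.1, 0.4, 0.9])
```

Minimizing this expected risk shifts probability mass toward low-risk (high-BLEU) candidates; a larger `alpha` sharpens `Q` toward the model's most probable candidate.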
The following table summarizes the features of the three implementations:
| Implementation | Model | Criterion | Optimizer | LRP |
| :------------: | :---: | :-------: | :-------: | :-: |
| Theano | RNNsearch | MLE, MRT, SST | SGD, AdaDelta, Adam | RNNsearch |
| TensorFlow | Seq2Seq, RNNsearch, Transformer | MLE | Adam | RNNsearch, Transformer |
| PyTorch | Transformer | MLE | SGD, Adadelta, Adam | N.A. |
The documentation of the PyTorch implementation is available here.
Please cite the following papers:
Zhixing Tan, Jiacheng Zhang, Xuancheng Huang, Gang Chen, Shuo Wang, Maosong Sun, Huanbo Luan, Yang Liu. THUMT: An Open Source Toolkit for Neural Machine Translation. AMTA 2020.
Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng, Maosong Sun, Huanbo Luan, Yang Liu. 2017. THUMT: An Open Source Toolkit for Neural Machine Translation. arXiv:1706.06415.
The following people contributed to THUMT:

Theano: Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng
TensorFlow: Zhixing Tan, Jiacheng Zhang, Xuancheng Huang, Gang Chen, Shuo Wang, Zonghan Yang
PyTorch: Zhixing Tan, Gang Chen
If you have questions, suggestions, or bug reports, please email [email protected].