# A TensorFlow Implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling

TensorFlow implementation of attention-based LSTM models for sequence classification and sequence labeling.
## Updates - 2017/07/29

* Updated code to work with the latest TensorFlow API: r1.2
* Code cleanup and formatting
* Note that this published code does not include the modeling of output label dependencies. One may add a loop function, as in the `rnn_decoder` function in the TensorFlow [seq2seq.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py#L292) example, to feed the emitted label embedding back to the RNN state. Alternatively, sequence-level optimization can be performed by adding a [CRF layer](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/crf) on top of the RNN outputs; see the sketches after this list.
* The dataset used in the paper can be found at https://github.com/yvchen/JointSLU/tree/master/data. We used the training set from the original ATIS train/test split, which has 4,978 training samples. There are 15 test samples with multiple intent labels per utterance; we used the more frequent label (most likely "flight") as the true label during evaluation.
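As a rough illustration of the first option, the sketch below shows a `loop_function` in the style of `tf.contrib.legacy_seq2seq.rnn_decoder`, which feeds the embedding of the previously emitted label back as the next step's input. This is not part of the published code; `label_embedding`, `proj_w`, `proj_b`, and the sizes are hypothetical placeholders for the corresponding variables in your own graph.

```python
import tensorflow as tf

# Hypothetical sizes and variables for illustration only.
num_labels, label_embedding_size, cell_size = 120, 64, 128
label_embedding = tf.get_variable(
    "label_embedding", [num_labels, label_embedding_size])
proj_w = tf.get_variable("proj_w", [cell_size, num_labels])
proj_b = tf.get_variable("proj_b", [num_labels])

def emitted_label_loop_function(prev, _):
    """Map the previous cell output to label logits, pick the label
    greedily, and return its embedding as the next decoder input."""
    prev_logits = tf.matmul(prev, proj_w) + proj_b
    prev_label = tf.argmax(prev_logits, axis=1)
    return tf.nn.embedding_lookup(label_embedding, prev_label)
```

For the second option, here is a minimal sketch of a CRF layer over the RNN outputs using the `tf.contrib.crf` API available in r1.2. It continues from the snippet above (reusing `tf` and `num_labels`); the placeholder shapes and tensor names are assumptions, so substitute the tensors actually produced by this repo's encoder.

```python
# Assumed shapes for illustration; replace the placeholders with the
# encoder outputs, gold tag indices, and true sequence lengths.
batch_size, max_len, rnn_size = 16, 50, 128

rnn_outputs = tf.placeholder(tf.float32, [batch_size, max_len, rnn_size])
labels = tf.placeholder(tf.int32, [batch_size, max_len])
seq_lengths = tf.placeholder(tf.int32, [batch_size])

# Per-step unary scores for each slot tag.
logits = tf.layers.dense(rnn_outputs, num_labels)

# The CRF scores entire tag sequences, learning a transition matrix
# between adjacent tags; training minimizes the negative log-likelihood.
log_likelihood, transition_params = tf.contrib.crf.crf_log_likelihood(
    logits, labels, seq_lengths)
crf_loss = tf.reduce_mean(-log_likelihood)
# At test time, tf.contrib.crf.viterbi_decode(score, transition_params)
# recovers the highest-scoring tag path for one example.
```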
## Setup

* TensorFlow r1.2 ([API docs](https://www.tensorflow.org/api_docs/))
Usage:

```bash
data_dir=data/ATIS_samples
model_dir=model_tmp
max_sequence_length=50  # max length for train/valid/test sequences
task=joint              # available options: intent; tagging; joint
bidirectional_rnn=True  # available options: True; False
use_attention=True      # available options: True; False

python run_multi-task_rnn.py --data_dir $data_dir \
      --train_dir $model_dir \
      --max_sequence_length $max_sequence_length \
      --task $task \
      --bidirectional_rnn $bidirectional_rnn \
      --use_attention $use_attention
```
## Reference
```
@inproceedings{Liu+2016,
  author={Bing Liu and Ian Lane},
  title={Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling},
  year=2016,
  booktitle={Interspeech 2016},
  doi={10.21437/Interspeech.2016-1352},
  url={http://dx.doi.org/10.21437/Interspeech.2016-1352},
  pages={685--689}
}
```
## Contact
Feel free to email [email protected] with any questions or bug reports regarding the code.