Attention-based RNN model for Spoken Language Understanding (Intent Detection & Slot Filling)

TensorFlow implementation of attention-based LSTM models for sequence classification and sequence labeling.
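
For orientation, here is a minimal sketch of the joint idea, not this repository's code: a bidirectional LSTM encoder whose per-step states produce slot-label logits, plus an attention-pooled summary that produces the intent logits. All sizes and variable names are illustrative assumptions, and padding masks are omitted for brevity.

```python
import tensorflow as tf

# Illustrative sizes; the real model reads these from flags and the data.
vocab_size, embed_dim, hidden_dim = 10000, 128, 128
num_slots, num_intents = 128, 22

words = tf.placeholder(tf.int32, [None, None], name="words")  # [batch, time]
seq_len = tf.placeholder(tf.int32, [None], name="seq_len")    # true lengths

embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
inputs = tf.nn.embedding_lookup(embedding, words)

# Bidirectional LSTM encoder over the utterance.
fw = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
bw = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
(out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    fw, bw, inputs, sequence_length=seq_len, dtype=tf.float32)
states = tf.concat([out_fw, out_bw], axis=2)  # [batch, time, 2*hidden]

# Slot filling: a label distribution at every time step.
slot_logits = tf.layers.dense(states, num_slots)

# Intent detection: attention-weighted pooling of the encoder states.
scores = tf.layers.dense(tf.tanh(tf.layers.dense(states, hidden_dim)), 1)
alphas = tf.nn.softmax(scores, dim=1)  # attention weights over time
intent_logits = tf.layers.dense(tf.reduce_sum(alphas * states, axis=1),
                                num_intents)
```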

Updates - 2017/07/29

  • Updated code to work with the latest TensorFlow API: r1.2
  • Code cleanup and formatting
  • Note that this published code does not include the modeling of output label dependencies. One may add a loop function, as in the rnn_decoder function in TensorFlow's legacy seq2seq example (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py#L292), to feed the emitted label embedding back to the RNN state. Alternatively, sequence-level optimization can be performed by adding a CRF layer (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/crf) on top of the RNN outputs. Both options are sketched after this list.
  • The dataset used in the paper can be found at https://github.com/yvchen/JointSLU/tree/master/data. We used the training set of the original ATIS train/test split, which contains 4,978 training samples. There are 15 test samples that have multiple intent labels for an utterance; we used the more frequent label (most likely, "flight") as the true label during evaluation.
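
To make the label-dependency note above concrete, a loop function in the style of tf.contrib.legacy_seq2seq.rnn_decoder could look roughly like the following. This is a hypothetical sketch, not code from this repository: it greedily decodes the previous step's slot label and feeds its embedding forward, and all variable names and sizes (label_embedding, proj_w, proj_b, the dimensions) are assumptions.

```python
import tensorflow as tf

num_slots, label_embed_dim, cell_output_dim = 128, 64, 128  # assumed sizes

label_embedding = tf.get_variable("label_embedding",
                                  [num_slots, label_embed_dim])
proj_w = tf.get_variable("proj_w", [cell_output_dim, num_slots])
proj_b = tf.get_variable("proj_b", [num_slots])

def emit_label_loop_fn(prev, _i):
    # Project the previous cell output to slot logits, take the greedy label,
    # and return its embedding as the next step's input.
    logits = tf.matmul(prev, proj_w) + proj_b
    prev_label = tf.argmax(logits, axis=1)
    return tf.nn.embedding_lookup(label_embedding, prev_label)

# Would be passed as the loop_function argument of
# tf.contrib.legacy_seq2seq.rnn_decoder(decoder_inputs, initial_state, cell,
#                                       loop_function=emit_label_loop_fn).
```

For the CRF alternative, tf.contrib.crf provides the sequence-level loss and Viterbi decoding directly. A minimal sketch, assuming slot logits of shape [batch, time, num_slots] as the unary scores:

```python
import tensorflow as tf

num_slots = 128  # assumed slot-label vocabulary size

# Unary scores from the RNN, gold tag indices, and true sequence lengths.
slot_logits = tf.placeholder(tf.float32, [None, None, num_slots])
slot_labels = tf.placeholder(tf.int32, [None, None])
seq_len = tf.placeholder(tf.int32, [None])

log_likelihood, transition_params = tf.contrib.crf.crf_log_likelihood(
    slot_logits, slot_labels, seq_len)
crf_loss = tf.reduce_mean(-log_likelihood)  # sequence-level training objective

# At test time, tf.contrib.crf.viterbi_decode(score, transition_params) decodes
# one sequence at a time in numpy, given the fetched transition matrix.
```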

Setup

  • TensorFlow, version r1.2 (https://www.tensorflow.org/api_docs/)
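
If TensorFlow is not already installed, the matching release can be fetched via pip (an assumption about your environment; GPU users would install tensorflow-gpu instead):

```bash
pip install tensorflow==1.2.0
```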

Usage

```bash
data_dir=data/ATIS_samples
model_dir=model_tmp
max_sequence_length=50  # max length for train/valid/test sequence
task=joint              # available options: intent; tagging; joint
bidirectional_rnn=True  # available options: True; False
use_attention=True      # available options: True; False

python run_multi-task_rnn.py --data_dir $data_dir \
      --train_dir $model_dir \
      --max_sequence_length $max_sequence_length \
      --task $task \
      --bidirectional_rnn $bidirectional_rnn \
      --use_attention $use_attention
```

Reference

  • Bing Liu, Ian Lane, "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling", Interspeech 2016
@inproceedings{Liu+2016,
author={Bing Liu and Ian Lane},
title={Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling},
year=2016,
booktitle={Interspeech 2016},
doi={10.21437/Interspeech.2016-1352},
url={http://dx.doi.org/10.21437/Interspeech.2016-1352},
pages={685--689}
}

Contact

Feel free to email [email protected] with any questions or bug reports regarding the code.
