Neural Abstractive Text Summarization with Sequence-to-Sequence Models
In this survey, we ran an extensive set of experiments with NATS on the following datasets. We provide a link to the CNN/Daily Mail dataset and data-processing code for the Newsroom and Bytecup2018 datasets:
- CNN/Daily Mail
- Newsroom
- Bytecup2018

In the preprocessed data, `<s>` and `</s>` are used to delimit sentences, and `<sec>` separates the summary from the article. We did not use the JSON format because it takes more space and is harder to transfer.
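The data format described above can be parsed with a short helper like the following sketch (the function name and field order are illustrative, not the repository's actual preprocessing code):

```python
# Hypothetical parser for one line of the preprocessed data format:
# summary and article are separated by <sec>, and each sentence is
# wrapped in <s> ... </s>.
def parse_line(line):
    """Split a raw line into (summary_sentences, article_sentences)."""
    summary_part, article_part = line.split("<sec>")

    def sentences(text):
        # Extract the text between each <s> ... </s> pair.
        out = []
        for chunk in text.split("<s>")[1:]:
            out.append(chunk.split("</s>")[0].strip())
        return out

    return sentences(summary_part), sentences(article_part)

line = "<s> a short summary . </s> <sec> <s> first sentence . </s> <s> second sentence . </s>"
summary, article = parse_line(line)
print(summary)   # ['a short summary .']
print(article)   # ['first sentence .', 'second sentence .']
```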
Use the following commands to validate and test a trained model and to compute ROUGE scores:
- Validate: `python main.py --task validate`
- Test: `python main.py --task beam`
- ROUGE: `python main.py --task rouge`
NATS is equipped with the following features:
Attention-based seq2seq framework. The encoder and decoder can be either LSTM or GRU. Attention scores can be calculated with three different alignment methods.
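As a minimal sketch of what "three different alignment methods" can look like, here are the dot, general, and concat scoring functions in the style of Luong et al. This is illustrative NumPy code under assumed shapes, not the repository's implementation:

```python
# Three common attention alignment (scoring) functions.
# s: decoder state, shape (d,); H: encoder states, shape (T, d).
# Wa and va are placeholder learned parameters.
import numpy as np

def score_dot(s, H):
    return H @ s                          # (T,)

def score_general(s, H, Wa):
    return H @ (Wa @ s)                   # (T,), Wa: (d, d)

def score_concat(s, H, Wa, va):
    T = H.shape[0]
    concat = np.concatenate([np.repeat(s[None, :], T, axis=0), H], axis=1)
    return np.tanh(concat @ Wa.T) @ va    # (T,), Wa: (d, 2d), va: (d,)

def attention_weights(scores):
    e = np.exp(scores - scores.max())     # softmax over encoder positions
    return e / e.sum()

rng = np.random.default_rng(0)
d, T = 4, 3
s, H = rng.normal(size=d), rng.normal(size=(T, d))
w = attention_weights(score_dot(s, H))
wg = attention_weights(score_general(s, H, rng.normal(size=(d, d))))
wc = attention_weights(score_concat(s, H, rng.normal(size=(d, 2 * d)),
                                    rng.normal(size=d)))
print(w.shape)  # (3,)
```

All three produce a length-`T` score vector that is softmax-normalized into attention weights; they differ only in how the decoder state is compared with each encoder state.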
Intra-temporal attention mechanism and intra-decoder attention mechanism.
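One common formulation of intra-temporal attention divides each raw attention score by the sum of its exponentiated scores from previous decoding steps, discouraging the decoder from attending to the same source positions repeatedly. The sketch below illustrates that idea only; it is not the repository's code:

```python
# Intra-temporal attention sketch: positions attended heavily in
# earlier decoding steps get their scores down-weighted later.
import numpy as np

def intra_temporal_weights(score_history):
    """score_history: list of raw score vectors, one per decoding step.
    Returns normalized attention weights for the latest step."""
    e_t = np.exp(score_history[-1])
    if len(score_history) == 1:
        e_prime = e_t
    else:
        # Divide by the accumulated exponentiated scores of past steps.
        past = np.sum(np.exp(np.stack(score_history[:-1])), axis=0)
        e_prime = e_t / past
    return e_prime / e_prime.sum()

step1 = np.array([2.0, 0.5, 0.5])
step2 = np.array([2.0, 0.5, 0.5])   # identical raw scores at step 2
w1 = intra_temporal_weights([step1])
w2 = intra_temporal_weights([step1, step2])
print(w2[0] < w1[0])  # True: position 0 is penalized for prior attention
```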
Weight-sharing mechanism. Weight sharing can boost performance with significantly fewer parameters.
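A common form of weight sharing in seq2seq models, shown here as an assumed example (the README does not specify which weights are shared), is tying the decoder's output projection to the embedding matrix:

```python
# Weight-tying sketch: the output projection reuses the embedding
# matrix E, so no separate (d, vocab) output matrix is learned.
# All names here are illustrative.
import numpy as np

vocab, d = 10, 4
rng = np.random.default_rng(0)
E = rng.normal(size=(vocab, d))   # embedding matrix, shared

def embed(token_id):
    return E[token_id]            # lookup for the decoder input

def output_logits(hidden):
    return hidden @ E.T           # tied projection over the vocabulary

h = rng.normal(size=d)
logits = output_logits(h)
print(logits.shape)               # (10,)
# Parameters drop from vocab*d (embedding) + d*vocab (output layer)
# to a single shared vocab*d matrix.
```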
Beam search algorithm. We implemented an efficient beam search algorithm that also handles the case batch_size > 1.
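The core expand-and-prune step of beam search, for a single example, can be sketched as follows (a toy scoring function stands in for the decoder; the repository's version is batched):

```python
# One beam-search step: expand every beam with every candidate token,
# then keep only the beam_size highest-scoring hypotheses.
import heapq, math

def beam_step(beams, log_probs_fn, beam_size):
    """beams: list of (log_prob, token_sequence) pairs."""
    candidates = []
    for lp, seq in beams:
        for tok, tok_lp in log_probs_fn(seq).items():
            candidates.append((lp + tok_lp, seq + [tok]))
    return heapq.nlargest(beam_size, candidates, key=lambda c: c[0])

# Toy distribution: token log-probabilities independent of history.
dist = {0: math.log(0.6), 1: math.log(0.3), 2: math.log(0.1)}
beams = [(0.0, [])]
for _ in range(2):
    beams = beam_step(beams, lambda seq: dist, beam_size=2)
print(beams[0][1])  # [0, 0] -- the highest-probability 2-token sequence
```

Batching this means running the expand step for every example in the batch in one decoder forward pass and pruning per example, which is where the efficiency gain over `batch_size == 1` decoding comes from.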
Unknown-words replacement. This meta-algorithm can be used with any attention-based seq2seq model. OOV tokens in the generated summaries are replaced with words from the source article, selected using the attention weights.
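The replacement step described above can be sketched in a few lines: each `<unk>` in the output is swapped for the source word with the highest attention weight at that decoding step. Variable names are illustrative:

```python
# Unknown-word replacement via attention weights.
import numpy as np

def replace_unknowns(summary_tokens, source_tokens, attn_weights):
    """attn_weights: (num_decode_steps, source_len) attention matrix."""
    out = []
    for t, tok in enumerate(summary_tokens):
        if tok == "<unk>":
            # Copy the source word most attended to at this step.
            out.append(source_tokens[int(np.argmax(attn_weights[t]))])
        else:
            out.append(tok)
    return out

source = ["obama", "visited", "berlin", "today"]
summary = ["<unk>", "visited", "<unk>"]
attn = np.array([[0.7, 0.1, 0.1, 0.1],
                 [0.1, 0.7, 0.1, 0.1],
                 [0.1, 0.1, 0.7, 0.1]])
print(replace_unknowns(summary, source, attn))
# ['obama', 'visited', 'berlin']
```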
Experimental results can be found in our survey paper.
Model configurations:
- concat + temporal
- concat + temporal + attn_decoder