This is the code for "How to Make a Text Summarizer - Intro to Deep Learning #10" by Siraj Raval on YouTube.
The challenge for this video is to make a text summarizer for a set of articles with Keras. You can use any textual dataset to do this. By doing this you'll learn more about encoder-decoder architecture and the role of attention in deep learning. Good luck!
Use pip to install any missing dependencies.
The video example is built from the text at the start of each article, which I call the description (or 'desc'), and the text of the original headline (or 'head'). The texts should already be tokenized, with the tokens separated by spaces. This is a good example dataset: use the 'content' field as the 'desc' and the 'title' field as the 'head'.
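As a rough sketch of that preparation step, here is one way to turn raw 'title' and 'content' fields into space-separated token strings. The field names and the regex-based tokenizer are assumptions for illustration, not the repo's exact preprocessing:

```python
import re

def tokenize(text):
    # Lowercase, then split into word and punctuation tokens joined by single spaces.
    return " ".join(re.findall(r"\w+|[^\w\s]", text.lower()))

# Hypothetical article records with 'title' and 'content' fields.
articles = [
    {"title": "Stocks Rally on Earnings",
     "content": "Shares rose sharply after strong quarterly earnings reports."},
]

heads = [tokenize(a["title"]) for a in articles]    # headline strings
descs = [tokenize(a["content"]) for a in articles]  # description strings, same order
```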
Once you have the data ready, save it in a Python pickle file as a tuple:
(heads, descs, keywords)
where heads is a list of all the head strings and descs is a list of all the article strings, in the same order and of the same length as heads. The keywords information is ignored, so you can place None there.
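Saving and reloading that tuple is a one-liner with the standard pickle module. The filename below is just an example, not a path the training code necessarily expects:

```python
import pickle

# heads and descs are equal-length lists of tokenized strings;
# the keywords slot is ignored downstream, so None works as a placeholder.
heads = ["stocks rally on earnings"]
descs = ["shares rose sharply after strong quarterly earnings reports ."]
keywords = None

with open("tokens.pkl", "wb") as f:
    pickle.dump((heads, descs, keywords), f)

# Reload to verify the round trip.
with open("tokens.pkl", "rb") as f:
    loaded_heads, loaded_descs, loaded_keywords = pickle.load(f)
```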
Here is a link explaining how to get similar datasets.
The predict notebook generates headlines with the trained model and shows the attention weights used to pick words from the description. The text generation includes a feature that was not described in the original paper: it allows words outside the training vocabulary to be copied from the description into the generated headline.
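The idea behind that copy feature can be sketched as follows: whenever the decoder emits an out-of-vocabulary placeholder, substitute the description word that received the highest attention weight at that step. This is a simplified illustration of the mechanism, not the repo's exact implementation:

```python
import numpy as np

def copy_oov(generated_tokens, attention, desc_tokens, oov_token="<unk>"):
    """Replace OOV placeholders in a generated headline using attention.

    attention has shape (len(generated_tokens), len(desc_tokens)):
    one weight per description word at each decoding step.
    """
    out = []
    for i, tok in enumerate(generated_tokens):
        if tok == oov_token:
            # Copy the description word the decoder attended to most.
            out.append(desc_tokens[int(np.argmax(attention[i]))])
        else:
            out.append(tok)
    return out

# Toy example: the first generated token is OOV; attention at that step
# points mostly at the second description word.
desc = ["acme", "soars"]
attn = np.array([[0.1, 0.9],
                 [0.8, 0.2]])
headline = copy_oov(["<unk>", "stock"], attn, desc)
```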
Good (cherry-picked) examples of headlines generated:
The credit for this code goes to udibr; I've merely created a wrapper to make it easier to get started.