by clab

Tutorial on "Practical Neural Networks for NLP: From Theory to Code" at EMNLP 2016



Practical Neural Networks for NLP

A tutorial given by Chris Dyer, Yoav Goldberg, and Graham Neubig at EMNLP 2016 in Austin. The tutorial covers the basics of neural networks for NLP and how to implement a variety of networks simply and efficiently in the DyNet toolkit.

  • Slides, part 1: Basics

    • Computation graphs and their construction
    • Neural networks in DyNet
    • Recurrent neural networks
    • Minibatching
    • Adding new differentiable functions
  • Slides, part 2: Case studies in NLP

    • Tagging with bidirectional RNNs and character-based embeddings
    • Transition-based dependency parsing
    • Structured prediction meets deep learning
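The central concept in part 1 is the computation graph: each expression you build records its inputs and local derivatives, so gradients can be propagated backward automatically. As a rough illustration of that idea, here is a minimal hand-rolled sketch in plain Python; this is not DyNet's actual API, and the `Node`, `add`, `mul`, and `backward` names are invented for illustration.

```python
# Minimal computation-graph sketch with reverse-mode autodiff.
# Illustrative only; DyNet builds and differentiates graphs like this for you.

class Node:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # forward value
        self.parents = parents    # nodes this node depends on
        self.grad_fns = grad_fns  # local gradient function per parent
        self.grad = 0.0           # accumulated dL/d(this node)

def add(a, b):
    return Node(a.value + b.value, (a, b), (lambda g: g, lambda g: g))

def mul(a, b):
    return Node(a.value * b.value, (a, b),
                (lambda g: g * b.value, lambda g: g * a.value))

def backward(out):
    # Topologically order the graph, then push gradients from output to inputs.
    order, seen = [], set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for p in n.parents:
            visit(p)
        order.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(order):
        for p, fn in zip(n.parents, n.grad_fns):
            p.grad += fn(n.grad)

# y = x * w + b, a one-neuron "network"
x, w, b = Node(2.0), Node(3.0), Node(1.0)
y = add(mul(x, w), b)
backward(y)
print(y.value)  # 7.0
print(w.grad)   # dy/dw = x = 2.0
```

In DyNet the same pattern appears as building expressions on a fresh graph and calling `backward()` on the loss; the toolkit handles the bookkeeping shown above.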
