YSDA Natural Language Processing course

  • This is the 2020 version. For previous years' course materials, go to this branch
  • Lecture and seminar materials for each week are in ./week* folders; see README.md for materials and instructions
  • YSDA homework deadlines will be listed in Anytask (read more).
  • For technical issues, ideas, bugs in course materials, or contribution ideas, add an issue
  • Installing libraries and troubleshooting: this thread.

Syllabus

  • week01 Word Embeddings

    • Lecture: Word embeddings. Distributional semantics. Count-based (pre-neural) methods. Word2Vec: learn vectors. GloVe: count, then learn. Evaluation: intrinsic vs extrinsic. Analysis and Interpretability. Interactive lecture materials and more. A minimal count-based embedding sketch appears after the syllabus.
    • Seminar: Playing with word and sentence embeddings
    • Homework: Embedding-based machine translation system
  • week02 Text Classification

    • Lecture: Text classification: introduction and datasets. General framework: feature extractor + classifier. Classical approaches: Naive Bayes, MaxEnt (Logistic Regression), SVM. Neural Networks: General View, Convolutional Models, Recurrent Models. Practical Tips: Data Augmentation. Analysis and Interpretability. Interactive lecture materials and more. A classical-baseline sketch appears after the syllabus.
    • Seminar: Text classification with convolutional NNs.
    • Homework: Statistical & neural text classification.
  • week03 Language Modeling

    • Lecture: Language Modeling: what does it mean? Left-to-right framework. N-gram language models. Neural Language Models: General View, Recurrent Models, Convolutional Models. Evaluation. Practical Tips: Weight Tying. Analysis and Interpretability. Interactive lecture materials and more.
    • Seminar: Build an N-gram language model from scratch (a minimal sketch appears after the syllabus)
    • Homework: Neural LMs & smoothing in count-based models.
  • week04 Seq2seq and Attention

    • Lecture: Seq2seq Basics: Encoder-Decoder framework, Training, Simple Models, Inference (e.g., beam search). Attention: general, score functions, models. Transformer: self-attention, masked self-attention, multi-head attention; model architecture. Subword Segmentation (BPE). Analysis and Interpretability: functions of attention heads; probing for linguistic structure. Interactive lecture materials and more. A scaled dot-product attention sketch appears after the syllabus.
    • Seminar: Basic sequence to sequence model
    • Homework: Machine translation with attention

More TBA
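
Code sketches (illustrative)

The sketches below are minimal, self-contained illustrations of ideas from the lectures above; they are not course materials. First, week01's count-based recipe: build a word-context co-occurrence matrix, reweight it with PPMI, and factorize it with truncated SVD to get dense embeddings. The toy corpus, window size, and dimensionality are assumptions made for the example.

```python
# Minimal count-based word embeddings: co-occurrence counts -> PPMI -> truncated SVD.
# The toy corpus and hyperparameters are illustrative assumptions, not course defaults.
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs are animals".split(),
]
window = 2  # symmetric context window

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count word-context co-occurrences within the window.
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1

# Positive PMI: max(0, log(P(w, c) / (P(w) P(c)))).
total = counts.sum()
pw = counts.sum(axis=1, keepdims=True) / total
pc = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts / total) / (pw * pc))
ppmi = np.maximum(pmi, 0)
ppmi[~np.isfinite(ppmi)] = 0.0

# Truncated SVD gives dense d-dimensional embeddings.
u, s, _ = np.linalg.svd(ppmi)
d = 3
embeddings = u[:, :d] * s[:d]

def most_similar(word, k=3):
    """Rank the vocabulary by cosine similarity to `word`."""
    v = embeddings[idx[word]]
    sims = embeddings @ v / (np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v) + 1e-9)
    return [vocab[i] for i in np.argsort(-sims)[1:k + 1]]

print(most_similar("cat"))
```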
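
For week02, a sketch of the general framework (feature extractor + classifier) with one of the classical approaches: bag-of-words counts fed to logistic regression (MaxEnt), here via scikit-learn. The tiny labeled dataset is made up for illustration.

```python
# Classical text classification baseline: bag-of-words features + logistic regression.
# The dataset below is a made-up illustration, not course data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "great movie, loved every minute",
    "absolutely wonderful acting",
    "terrible plot and boring pacing",
    "worst film I have seen this year",
]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# Feature extractor (unigram + bigram counts) + classifier (MaxEnt).
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["boring and terrible", "wonderful movie"]))  # expect [0, 1]
```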
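
For week03, a from-scratch bigram language model with add-k smoothing and perplexity evaluation, in the spirit of the seminar. The corpus, smoothing constant, and token handling are toy assumptions.

```python
# A bigram language model built from scratch, with add-k smoothing and perplexity.
import math
from collections import Counter

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]
BOS, EOS = "<s>", "</s>"

# Collect unigram and bigram counts over padded sentences.
unigrams, bigrams = Counter(), Counter()
for line in corpus:
    tokens = [BOS] + line.split() + [EOS]
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

vocab_size = len(unigrams)
k = 0.5  # add-k smoothing constant (toy choice)

def bigram_prob(prev, word):
    """P(word | prev) with add-k smoothing."""
    return (bigrams[(prev, word)] + k) / (unigrams[prev] + k * vocab_size)

def perplexity(sentence):
    """exp of the average negative log-probability per predicted token."""
    tokens = [BOS] + sentence.split() + [EOS]
    logp = sum(math.log(bigram_prob(p, w)) for p, w in zip(tokens, tokens[1:]))
    return math.exp(-logp / (len(tokens) - 1))

print(perplexity("the cat sat on the log"))
```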
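
For week04, scaled dot-product attention with an optional causal mask, the core operation behind the attention and Transformer material. Shapes and the random toy inputs are illustrative assumptions.

```python
# Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V, with optional masking.
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Return (attended values, attention weights); mask=True marks allowed positions."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (n_queries, n_keys)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # blocked positions get ~zero weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = rng.normal(size=(n, d)), rng.normal(size=(n, d)), rng.normal(size=(n, d))

# Masked (causal) self-attention: position i may only attend to positions <= i.
causal = np.tril(np.ones((n, n), dtype=bool))
out, attn = scaled_dot_product_attention(Q, K, V, mask=causal)
print(attn.round(2))  # each row sums to 1; the upper triangle is ~0
```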

Contributors & course staff

Course materials and teaching performed by:

  • Elena Voita - course admin, lectures, seminars, homeworks
  • Boris Kovarsky - lectures, seminars, homeworks
  • David Talbot - lectures, seminars, homeworks
  • Sergey Gubanov - lectures, seminars, homeworks
  • Just Heuristic - lectures, seminars, homeworks
