Natural Language Processing

ETH Zürich, Spring 2021: Course catalog

Previous Classes: Fall 2020

Course Description

This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.

The objective of the course is for students to learn the basic concepts in the statistical processing of natural languages. The course is project-oriented so that students can also gain hands-on experience with state-of-the-art tools and techniques.

Grading

Marks for the course will be determined by the following formula:
* 70% Final Exam
* 30% Course Project/Assignment
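
For concreteness, a minimal sketch of how the weighted final mark is computed under this split (the function name and the 0-100 scale are illustrative assumptions, not the official grading scale):

```python
def final_mark(exam: float, project: float) -> float:
    """Combine marks using the stated weights: 70% exam, 30% project/assignment."""
    return 0.70 * exam + 0.30 * project

# Example (illustrative 0-100 scale): 80 on the exam, 90 on the project.
print(final_mark(80, 90))  # 0.7 * 80 + 0.3 * 90 = 83.0
```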

Lectures: Wed 12-14h, Zoom.

Discussion Sections: Thurs 17-18h and Fri 11-12h, Zoom.

Textbooks: Introduction to Natural Language Processing (Eisenstein); Deep Learning (Goodfellow, Bengio and Courville)

News

18.02   Class website is online!
18.02   We are using Piazza as our discussion forum. Please enroll here.
24.02   First lecture.
04.03   First discussion section.

Syllabus

Week 1, 24.02.21: Introduction to Natural Language
  Readings: Eisenstein Ch. 1

Week 2, 03.03.21: Backpropagation
  Supplementary: Chris Olah's blog; Justin Domke's notes; Tim Vieira's blog; Moritz Hardt's notes; Baur and Strassen (1983); Griewank and Walther (2008); Eisner (2016); Computation Graph for MLP; Computation Graph Example

Week 3, 10.03.21: Log-Linear Modeling---Meet the Softmax
  Readings: Eisenstein Ch. 2
  Supplementary: Ferraro and Eisner (2013); Jason Eisner's list of further resources on log-linear modeling

Week 4, 17.03.21: Sentiment Analysis with Multi-layer Perceptrons
  Readings: Eisenstein Ch. 3 and Ch. 4; Goodfellow, Bengio and Courville Ch. 6
  Supplementary: Wikipedia; Cybenko (1989); Hanin and Sellke (2018); Pang and Lee (2008); Iyyer et al. (2015); word2vec Parameter Learning Explained; word2vec Explained

Week 5, 24.03.21: Language Modeling with *n*-grams and LSTMs
  Readings: Eisenstein Ch. 6; Goodfellow, Bengio and Courville Ch. 10
  Supplementary: Good Tutorial on n-gram smoothing; Good–Turing Smoothing; Kneser and Ney (1995); Bengio et al. (2003); Mikolov et al. (2010)

Week 6, 31.03.21: Part-of-Speech Tagging with CRFs
  Readings: Eisenstein Ch. 7 and 8
  Supplementary: Tim Vieira's blog; McCallum et al. (2000); Lafferty et al. (2001); Sutton and McCallum (2011); Koller and Friedman (2009)
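
To give a flavor of the material, here is a minimal sketch of the softmax function at the heart of the week 3 lecture on log-linear modeling. This is not course-provided code; the feature and weight values are made up for illustration.

```python
import numpy as np

def softmax(scores: np.ndarray) -> np.ndarray:
    """Turn a vector of real-valued scores into a probability distribution."""
    shifted = scores - np.max(scores)  # standard stability trick; result unchanged
    exps = np.exp(shifted)
    return exps / exps.sum()

# A log-linear model scores each class by a dot product of a weight row with
# a feature vector, then normalizes the scores with softmax.
features = np.array([1.0, 0.0, 2.0])   # illustrative feature vector
weights = np.array([[0.5, -0.2, 0.1],  # one row of weights per class
                    [-0.3, 0.8, 0.4],
                    [0.0, 0.1, -0.5]])
probs = softmax(weights @ features)
print(probs, probs.sum())              # class probabilities summing to 1.0
```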

Course Project/Assignment

Every student has the option of completing either a research project or a structured assignment. The course project/assignment will be worth 30% of your final mark. The project is an open-ended research project in which students reimplement an existing research paper or, if they are so inclined, perform novel research. Please find the guidelines below. In the assignment, some of the questions are more theoretical and resemble the questions you will see on the final exam. However, the assignment may also contain a large coding portion, which would not look like the exam questions. For instance, we may ask you to implement a recurrent neural dependency parser. Please find the first portion of the assignment and the write-up template below. Assignments must be completed individually. Projects can be completed in groups of up to 4.

Contact

You can ask questions on Piazza. Please post questions there so that others can see them and share in the discussion. If you have questions that are not of general interest, please don't hesitate to contact us directly.

Lecturer: Ryan Cotterell
Teaching Assistants: Clara Meister, Niklas Stoehr, Selena Pepic, Rita Kuznetsova, Anej Svete, Alexandra Butoi, Anton Rael, David Wissel, Rajai Nasser, Afra Amini, Alberto Pennino, Alexander Immer