Course Homepage
Course Description
Natural language processing is ubiquitous in modern intelligent technologies, serving as a foundation for language translators, virtual assistants, search engines, and many other applications. In this course, we cover the foundations of modern methods for natural language processing, such as word embeddings, recurrent neural networks, transformers, and pretraining, and how they can be applied to important tasks in the field, such as machine translation and text classification. We also cover issues with these state-of-the-art approaches (such as robustness, interpretability, and sensitivity), identify their failure modes in different NLP applications, and discuss analysis and mitigation techniques for these issues.
Quick access links:
Class
| Platform | Where & when |
|---|---|
| Lectures | Wednesdays: 11:15-13:00 [STCC - Cloud C] & Thursdays: 13:15-14:00 [CE16] |
| Exercise Session | Thursdays: 14:15-16:00 [CE11] |
| Project Assistance (not every week) | Wednesdays: 13:15-14:00 [STCC - Cloud C] |
| Q&A Forum & Announcements | Ed Forum [link] |
| Grades | Moodle [link] |
| Lecture Recordings | Mediaspace [link] |
All lectures will be given in person and live-streamed on Zoom. The Zoom link is available on the Ed Forum (pinned post). Note that, in the event of a technical failure during the lecture, it may not be possible to continue the live stream on Zoom.
Recordings of the lectures will be made available on Mediaspace. We will reuse some of last year's recordings, and we may record a few new lectures where the content has changed.
Lecture Schedule
| Week | Date | Topic | Suggested Reading | Instructor |
|---|---|---|---|---|
| Week 1 | 18 Feb, 19 Feb | Introduction: Building a simple neural classifier [slides] [video]; Word embeddings [slides] [video] | | Antoine Bosselut |
| Week 2 | 25 Feb, 26 Feb | Classical LMs: Neural LMs: Fixed Context Models [slides] [video]; Neural LMs: RNNs [slides] [video] | | Antoine Bosselut |
| Week 3 | 4 Mar, 5 Mar | Sequence-to-sequence Models: Transformers [slides]; Tokenization [slides] | | Antoine Bosselut |
Exercise Schedule
| Week | Release Date | Exercise Session Date | Topic | Instructor |
|---|---|---|---|---|
| Week 1 | 19 Feb | 26 Feb | Intro + Setup | Madhur Panwar |
| Week 2 | 26 Feb | 5 Mar | LMs + Neural LMs: fixed-context models; Language and sequence-to-sequence models | Badr AlKhamissi |
| Week 3 | 5 Mar | 12 Mar | Attention + Transformers + Tokenization | Badr AlKhamissi |
| Week 4 | 12 Mar | 19 Mar | Pretrained LLMs | Badr AlKhamissi |
| Week 5 | 19 Mar | 26 Mar | Transfer Learning | Madhur Panwar |
| Week 6 | 26 Mar | 2 Apr | Text Generation | Madhur Panwar |
| Week 7 | 1 Apr | 2 Apr | In-context Learning + Post-training | TBD |
Contacts
Please email us at nlp-cs552-spring2026-ta-team [at] groupes [dot] epfl [dot] ch for any administrative questions, rather than emailing TAs individually. All questions about course content should be asked via Ed.
Lecturer: Antoine Bosselut
Teaching assistants: Madhur Panwar, Badr AlKhamissi, Zeming (Eric) Chen, Sepideh Mamooler, Ayush Tarun, Lazar Milikic