We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.

Credits to Afra for the lab logo and to Tim for the logo idea.

Current Foci

  • Formal Aspects of Language Modeling
  • Cognitive and (Psycho-)Linguistics
  • Information Theory
  • Computational Typology and Morphology
  • Bias and Fairness in NLP Systems
  • Algorithms for Parsing
  • Interpreting Neural Representations of Language
  • Computational Social Science

Lab News

  • 13/02/23 Ryan is giving an invited talk at the Dagstuhl Seminar.
  • 19/07/22 David Mortensen (CMU) is giving an invited talk.
  • 04/07/22 Noga Zaslavsky (MIT) is giving an invited talk.
  • 10/06/22 Aaron Schein (Columbia University) is giving an invited talk.
  • 23/05/22 Ryan is giving the STIRS Keynote at ACL 2022.

People

Senior Members

Patrizia Napoli

Administrative Assistant

ETH Zürich

Ryan Cotterell

Assistant Professor of Computer Science

ETH Zürich

PhD Students

Alumna*

Josef Valvoda

Alumnus

University of Cambridge

Postdoc at University of Copenhagen

*Alumna is the nominative plural of alumnum, which means student in Latin. This is a gender-neutral (or, better said, gender-neuter) version of the gendered words alumni and alumnae.

Teaching

Natural Language Processing

ETH Zürich Fall 2024
This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.

Neural Networks and Computational Complexity

ETH Zürich Fall 2024
This Bachelor’s seminar delves into the fascinating world of modern large language models (LLMs), which have revolutionized natural language processing. As these models continue to evolve and impact various domains, we will explore their potential, limitations, and underlying mechanisms through a theoretical lens. Throughout the seminar, we will address the following key questions: What are the real capabilities of large language models? What are their inherent limitations? How do these models function at a fundamental level? Under what circumstances are they likely to fail? Can we develop a comprehensive “science of LLMs” to address these inquiries? We will leverage formal language theory to provide a rigorous framework for understanding the representational capacity of neural language models.

NLP in the Wild

ETH Zürich Spring 2024
In recent years, NLP has become a part of our daily lives. Many of us use tools like Google Translate to understand sentences in languages we don’t know, and chatbots like ChatGPT to help draft essays and answer basic questions. However, even though most people recognize the utility of such tools, there are still many questions to be answered about their reliability and their impact on society. For example, to what extent can we or should we trust what ChatGPT says? Should chatbots ever be used in legal decision-making? What is the role that NLP should play in the education system? In this open-ended seminar, we will read and discuss opinions on the proper use of NLP in the real world, or as we term it, NLP in the wild!

Large Language Models

ETH Zürich Spring 2024
Large language models have become one of the most commonly deployed NLP inventions. In the past half-decade, their integration into core natural language processing tools has dramatically increased the performance of such tools, and they have entered the public discourse surrounding artificial intelligence. In this course, we start with the probabilistic foundations of language models, i.e., covering what constitutes a language model from a formal, theoretical perspective. We then discuss how to construct and curate training corpora, and introduce many of the neural-network architectures often used to instantiate language models at scale. The course covers aspects of systems programming, discussion of privacy and harms, as well as applications of language models in NLP and beyond.

Advanced Formal Language Theory

ETH Zürich Spring 2024
This course serves as an introduction to various advanced topics in formal language theory. The primary focus of the course is on weighted formalisms, which can easily be applied in machine learning. Topics include finite-state machines as well as the algorithms that are commonly used for their manipulation. We will also cover weighted context-free grammars, weighted pushdown automata, weighted tree automata, and weighted mildly context-sensitive formalisms.

Philosophy of Language and Computation I

ETH Zürich Spring 2024
This graduate class, partly taught like a seminar, is designed to help you understand the philosophical underpinnings of modern work in natural language processing (NLP), most of which is centered on statistical machine learning applied to natural language data.

Generating Text from Language Models

ACL (Toronto) July 2023
In this tutorial, we will provide a centralized and cohesive discussion of critical considerations when choosing how to generate text from a language model. We will cover a wide range of empirically observed problems (such as degradation, hallucination, and repetition) and the corresponding algorithmic solutions proposed in recent research (such as top-p sampling and its successors). We will then cover methods in controlled generation that go beyond merely ensuring coherence to ensure that text exhibits specific desired properties.
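
As a rough sketch of one such decoding algorithm (this is illustrative NumPy code, not the tutorial's implementation), nucleus (top-p) sampling restricts sampling to the smallest set of tokens whose cumulative probability exceeds p, then renormalizes:

```python
import numpy as np

def top_p_sample(logits, p=0.9, rng=None):
    """Nucleus (top-p) sampling: draw a token id from the smallest
    probability-sorted set of tokens whose cumulative mass exceeds p."""
    rng = rng if rng is not None else np.random.default_rng()
    # Softmax with the usual max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    order = np.argsort(probs)[::-1]       # token ids, most probable first
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1  # smallest nucleus covering mass p
    nucleus = order[:cutoff]
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()  # renormalize
    return int(rng.choice(nucleus, p=nucleus_probs))
```

With a small p the tail of the distribution is truncated entirely, which is precisely the mechanism credited with avoiding degenerate, low-probability continuations.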

Formal Language Theory and Neural Networks

ESSLLI (Ljubljana, Slovenia) Spring 2023

Thesis Projects

If you are an MSc student at ETH Zürich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning, and Statistics. To get a better sense of what currently interests us, we invite you to browse our recent publications. However, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not interested in it, or would not be willing to become interested.
 
Please send an email to ryan.cotterell@inf.ethz.ch with CC to afra.amini@inf.ethz.ch, anej.svete@inf.ethz.ch, and niklas.stoehr@inf.ethz.ch. State either [bachelor’s thesis] or [master’s thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five more concrete topics that interest you. We will do our best to find a project that suits your interests. We look forward to receiving your inquiry!

Joining Our Lab

Thank you very much for your interest in joining our group – unfortunately, we are no longer accepting PhD students!

If you are interested in working with us as a Master’s student, please see here. If co-advising is an option you would like to pursue, note that Ryan has previously co-advised Master’s students on NLP topics with Mrinmaya Sachan and others.