We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.

Lab Motto: We put the fun in funicular!

Current Foci

  • Decoding Strategies for Neural MT
  • Information-Theoretic Linguistics
  • Computational Typology
  • Computational Morphology
  • Bias and Fairness in NLP Systems
  • Computational Approaches to Metaphor
  • Low-resource Linguistic Annotation
  • Algorithms for Parsing
  • Interpreting Neural Representations of Language
  • Computational Social Science
  • NLP Applications

Lab News

  • Fall 2021: Ryan is giving invited talks at EPFL, University of Melbourne, University of Notre Dame, Gothenburg University and MIT.

  • 26/8: 7 papers accepted to EMNLP 2021; 4 papers accepted to Findings of the ACL: EMNLP 2021

    • Revisiting the Uniform Information Density Hypothesis
    • A Bayesian Framework for Information-Theoretic Probing
    • Efficient Sampling of Dependency Structure
    • Phone-level Uniform Information Density across and within Languages
    • On Homophony and Rényi Entropy
    • Conditional Poisson Stochastic Beams
    • Text or Topology? Classifying Ally-Enemy Pairs in Militarised Conflict
    • Equivariant Transduction through Invariant Alignment
    • Searching for More Efficient Dynamic Programs
    • Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
    • Adjusting the Conflict-Cooperation Scale for Armed Conflict Assessment
  • 5/5: 7 papers accepted to ACL 2021; 1 paper accepted to Findings of the ACL

    • A cognitive regularizer for language modeling
    • On Finding the K-best Non-projective Dependency Trees
    • Determinantal Beam Search
    • Is Sparse Attention more Interpretable?
    • Language Model Evaluation Beyond Perplexity
    • Examining the Inductive Bias of Neural Language Models with Artificial Languages
    • Higher-order Derivatives of Weighted Finite-state Machines
    • Modelling the Unigram Distribution
  • 10/3: 5 papers (3 long, 2 short) accepted to NAACL 2021

    • A Non-Linear Structural Probe
    • What About the Precedent: An Information-Theoretic Analysis of Common Law
    • Do Syntactic Probes Probe Syntax? Experiments with Jabberwocky Probing
    • Finding Concept-specific Biases in Form–Meaning Associations
    • How (Non-)Optimal is the Lexicon?
  • 25/2: Ryan is giving an invited talk at the ELLIS NLP Workshop on joint work with Clara and Tim

  • 28/1: 3 papers accepted to EACL 2021

    • Disambiguatory signals are stronger in word initial positions
    • Searching for Search Errors in Neural Morphological Inflection
    • Applying the Transformer to Character-level Transduction




Gian Wiher

Research Assistant

ETH Zürich


Jiaoda Li

Master’s Student

ETH Zürich


Selena Pepić

Master’s Student

ETH Zürich

Frequent Collaborators


Isabelle Augenstein

Associate Professor of Computer Science

University of Copenhagen


Mans Hulden

Associate Professor of Linguistics

University of Colorado Boulder


Simone Teufel

Professor of Computer Science

University of Cambridge


*Alumna is the nominative plural of alumnum, which means student in Latin. This is a gender-neutral (or, better said, gender-neuter) version of the gendered words alumni and alumnae.



Natural Language Processing

ETH Zürich Fall 2021
This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.

Dependency Structures and Lexicalized Grammars

ETH Zürich Fall 2021
Dependency parsing is a fundamental task in natural language processing. This seminar explores a variety of algorithms for efficient dependency parsing and their derivation in a unified algebraic framework, working through the core mathematical ideas behind the parsers.
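To give a flavor of the kind of algorithm the seminar covers, here is a minimal sketch (not course material) of Eisner's O(n³) dynamic program for scoring the best projective dependency tree; the function name and the arc-score-matrix convention are our own illustrative choices, assuming arc scores are given by some model:

```python
def eisner(scores):
    """Score of the best projective dependency tree (Eisner's algorithm).

    scores[h][m] is the score of an arc from head h to modifier m;
    index 0 is the artificial ROOT token.
    """
    n = len(scores)
    # C[i][j][d]: best complete span over i..j; I[i][j][d]: best incomplete span.
    # d = 0: head at the right end (left arc); d = 1: head at the left end (right arc).
    C = [[[0.0, 0.0] for _ in range(n)] for _ in range(n)]
    I = [[[0.0, 0.0] for _ in range(n)] for _ in range(n)]
    for w in range(1, n):          # span width, smallest first
        for i in range(n - w):
            j = i + w
            # Join two complete halves, then add the arc between the endpoints.
            inner = max(C[i][r][1] + C[r + 1][j][0] for r in range(i, j))
            I[i][j][0] = inner + scores[j][i]   # arc j -> i
            I[i][j][1] = inner + scores[i][j]   # arc i -> j
            # Extend an incomplete span with a complete one.
            C[i][j][0] = max(C[i][r][0] + I[r][j][0] for r in range(i, j))
            C[i][j][1] = max(I[i][r][1] + C[r][j][1] for r in range(i + 1, j + 1))
    return C[0][n - 1][1]          # complete span headed at ROOT
```

For example, with ROOT plus two words and scores favoring the chain ROOT → 1 → 2, `eisner([[0, 1, 1], [0, 0, 5], [0, 3, 0]])` returns 6 (the arcs 0→1 and 1→2). Recovering the tree itself requires backpointers, omitted here for brevity.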

ESSLLI 2021: Information Theory in Linguistics

Virtual Summer 2021
The European Summer School in Logic, Language and Information (ESSLLI) provides an interdisciplinary setting in which courses and workshops are offered in logic, linguistics and computer science. The Information Theory in Linguistics course focuses on the application of information-theoretic methods to natural language processing, emphasizing interdisciplinary connections with the field of linguistics.
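As a toy illustration of the information-theoretic quantities such a course works with (a sketch of ours, not course material), the Shannon entropy of an empirical unigram distribution over a tokenized corpus can be computed as:

```python
import math
from collections import Counter

def unigram_entropy(tokens):
    """Shannon entropy, in bits, of the empirical unigram distribution.

    H(p) = -sum_w p(w) * log2 p(w), with p estimated by relative frequency.
    """
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

A uniform two-word corpus such as `["a", "b", "a", "b"]` has entropy exactly 1 bit; skewing the distribution lowers it, which is the basic intuition behind notions like surprisal and uniform information density.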

Thesis Projects

If you are an MSc student at ETH Zürich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning and Statistics. To get a better sense of what currently interests us, we invite you to browse our recent publications. That said, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not, or could not become, interested in it.
Please send an email to ryan.cotterell@inf.ethz.ch with CC to clara.meister@inf.ethz.ch and niklas.stoehr@inf.ethz.ch, and state either [bachelor’s thesis] or [master’s thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five more concrete topics that interest you. We will try our best to find a project that suits your interests. We look forward to receiving your inquiry!

ETHZ++ NLP Reading Group


Our lab, along with Mrinmaya Sachan’s lab, meets weekly on Fridays at 15:00 CEST. Here’s what we’re reading!

04/12/19: Semantic categories of artifacts and animals reflect efficient coding
11/12/19: SpanBERT: Improving Pre-training by Representing and Predicting Spans
15/01/20: Gender-Aware Reinflection using Linguistically Enhanced Neural Models

Joining Our Lab

Thank you very much for your interest in joining our group. Unfortunately, we are no longer accepting PhD students!

If you are interested in working with us as a Master’s student, please see here. If co-advising is an option you would like to pursue, note that Ryan has previously co-advised Master’s students on NLP topics with Mrinmaya Sachan and others; at Cambridge, he co-advises MPhil students with Simone Teufel. We look forward to receiving your inquiry!