We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.

Lab Motto: We put the fun in funicular!

Current Foci

  • Decoding Strategies for Neural MT
  • Information-Theoretic Linguistics
  • Computational Typology
  • Computational Morphology
  • Bias and Fairness in NLP Systems
  • Computational Approaches to Metaphor
  • Low-resource Linguistic Annotation
  • Algorithms for Parsing
  • Interpreting Neural Representations of Language
  • Computational Social Science
  • NLP Applications

Lab News

  • 26/8: 7 papers accepted to EMNLP 2021; 4 papers accepted to Findings of the ACL: EMNLP 2021
    • Revisiting the Uniform Information Density Hypothesis
    • A Bayesian Framework for Information-Theoretic Probing
    • Efficient Sampling of Dependency Structures
    • Phone-level Uniform Information Density across and within Languages
    • On Homophony and Rényi Entropy
    • Conditional Poisson Stochastic Beams
    • Text or Topology? Classifying Ally-Enemy Pairs in Militarised Conflict
    • Equivariant Transduction through Invariant Alignment
    • Searching for More Efficient Dynamic Programs
    • Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
    • Adjusting the Conflict-Cooperation Scale for Armed Conflict Assessment
  • 5/5: 7 papers accepted to ACL 2021; 1 paper accepted to Findings of the ACL
    • A Cognitive Regularizer for Language Modeling
    • On Finding the K-best Non-projective Dependency Trees
    • Determinantal Beam Search
    • Is Sparse Attention more Interpretable?
    • Language Model Evaluation Beyond Perplexity
    • Examining the Inductive Bias of Neural Language Models with Artificial Languages
    • Higher-order Derivatives of Weighted Finite-state Machines
    • Modelling the Unigram Distribution
  • 10/3: 5 papers (3 long, 2 short) accepted to NAACL 2021
    • A Non-Linear Structural Probe
    • What About the Precedent: An Information-Theoretic Analysis of Common Law
    • Do Syntactic Probes Probe Syntax? Experiments with Jabberwocky Probing
    • Finding Concept-specific Biases in Form–Meaning Associations
    • How (Non-)Optimal is the Lexicon?
  • 25/2: Ryan is giving an invited talk at the ELLIS NLP Workshop on joint work with Clara and Tim

  • 28/1: 3 papers accepted to EACL 2021

    • Disambiguatory Signals are Stronger in Word-initial Positions
    • Searching for Search Errors in Neural Morphological Inflection
    • Applying the Transformer to Character-level Transduction




Afra Amini

Master’s Student

ETH Zürich


Jiaoda Li

Master’s Student

ETH Zürich


Selena Pepić

Master’s Student

ETH Zürich

Frequent Collaborators


Isabelle Augenstein

Associate Professor of Computer Science

University of Copenhagen


Mans Hulden

Associate Professor of Linguistics

University of Colorado Boulder


Simone Teufel

Professor of Computer Science

University of Cambridge


*Alumna is the nominative plural of alumnum, which means student in Latin. This is a gender-neutral (or, better said, gender-neuter) version of the gendered words alumni and alumnae.



Natural Language Processing

ETH Zürich Spring 2021
This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.

ESSLLI 2021: Information Theory in Linguistics

Virtual Summer 2021
The European Summer School in Logic, Language and Information (ESSLLI) provides an interdisciplinary setting in which courses and workshops are offered in logic, linguistics and computer science. The Information Theory in Linguistics course focuses on the application of information-theoretic methods to natural language processing, emphasizing interdisciplinary connections with the field of linguistics.

Thesis Projects

If you are a BSc or MSc student at ETH Zurich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning and Statistics. To get a better sense of what currently interests us, we invite you to browse our recent publications. However, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not interested in it, or would not be willing to become so.
Please send an email to ryan.cotterell@inf.ethz.ch with CC to clara.meister@inf.ethz.ch and niklas.stoehr@inf.ethz.ch, and state either [bachelor's thesis] or [master's thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five more concrete topics that interest you. We will try our best to find a project that suits your interests. We look forward to receiving your inquiry!

Project Announcements

Numbers in NLP (Niklas)
NLP MSc theses with NZZ (Cristina, one Rycolab member)

ETHZ++ NLP Reading Group


Our lab, along with Mrinmaya Sachan’s lab, meets weekly on Fridays at 15:00 CEST. Here’s what we’re reading!

04/12/19: Semantic categories of artifacts and animals reflect efficient coding
11/12/19: SpanBERT: Improving Pre-training by Representing and Predicting Spans
15/01/20: Gender-Aware Reinflection using Linguistically Enhanced Neural Models

Joining Our Lab

Thank you very much for your interest in joining our group – we would be delighted to hear from you!

If you are interested in working with us as a Master's student, please see here. If co-advising is an option you would like to pursue, note that Ryan has previously co-advised Master's students on NLP topics with Mrinmaya Sachan and others. At Cambridge, Ryan co-advises MPhil students with Simone Teufel. We look forward to receiving your inquiry!

If you are interested in joining us as a PhD student: we get a lot of emails about joining our lab from super interesting, well-qualified applicants whom we would be very happy to have as research colleagues. However, there has been unbridled exuberance over joining our group, and we have grown quite quickly. Thus, for the foreseeable future, we can only accept new PhD students under one of the following circumstances:

  • ETHZ is part of the ELLIS PhD program, which supports PhD students and postdoctoral fellows by providing access to leading researchers through the boot camps, summer schools and workshops of the ELLIS program. You can apply for admission through the centralized application portal; the deadline is the 1st of December 2020. Our lab works closely with CopeNLU, headed by Isabelle Augenstein at the University of Copenhagen; applicants who apply through ELLIS are encouraged to tag both Ryan and Isabelle, as both are part of the ELLIS network.

  • Ryan is also a core faculty member of the newly established ETH AI Center, which offers fully funded PhD and postdoctoral fellowships. Please apply here and describe how you intend to take an interdisciplinary perspective on NLP.

  • If you are interested in NLP at ETHZ and do not yet have a Master's degree, please consider applying to ETHZ's direct doctorate program. You should tag Mrinmaya Sachan and Ryan and state that you are interested in NLP.

  • If you have a previous working relationship with members of our group, or if there is a co-advisor who has a previous working relationship with our lab.

Note on our research focus and collaboration style: we don't do leaderboard science or chase state-of-the-art numbers on ephemeral datasets. If you're okay with that, awesome! You'll probably fit right in. I spend a lot of time with those I collaborate with. I am a hands-on advisor who writes large swathes of our papers. I am also known to have strong opinions on modeling aesthetics, code style and experimental design. That may not be what you are looking for during your PhD, which is perfectly fine. This style of advising takes up a lot of my time, which is already stretched quite thin, hence the need to limit admissions. In many ways, I am more of a collaborator than a traditional advisor.