We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.

Lab Motto: We put the fun in funicular!

Current Foci

  • Decoding Strategies for Neural MT
  • Information-Theoretic Linguistics
  • Computational Typology
  • Computational Morphology
  • Bias and Fairness in NLP Systems
  • Computational Approaches to Metaphor
  • Low-resource Linguistic Annotation
  • Algorithms for Parsing
  • Interpreting Neural Representations of Language
  • Computational Social Science
  • NLP Applications

Lab News

  • 19/07/22 David Mortensen (CMU) is giving an invited talk
  • 04/07/22 Noga Zaslavsky (MIT) is giving an invited talk
  • 10/06/22 Aaron Schein (Columbia University) is giving an invited talk

  • 23/05/22 Ryan gives STIRS Keynote at ACL 2022

  • 24/02/22: 6 papers accepted to NAACL 2022

    • Same Neurons, Different Languages: Probing Morphosyntax in Multilingual Pre-trained Models
      Karolina Stańczak, Edoardo Ponti, Lucas Torroba Hennigen, Ryan Cotterell, Isabelle Augenstein
    • A Structured Span Selector
      Tianyu Liu, Yuchen Eleanor Jiang, Ryan Cotterell, Mrinmaya Sachan
    • BlonDe: An Automatic Evaluation Metric for Document-level Machine Translation
      Yuchen Eleanor Jiang, Tianyu Liu, Shuming Ma, Dongdong Zhang, Jian Yang, Haoyang Huang, Rico Sennrich, Ryan Cotterell, Mrinmaya Sachan, Ming Zhou
    • Exact Paired-Permutation Testing for Structured Test Statistics
      Ran Zmigrod, Tim Vieira, Ryan Cotterell
    • Probing via Prompting
      Jiaoda Li, Ryan Cotterell, Mrinmaya Sachan
    • A Word on Machine Ethics: A Response to Jiang et al. (2021)
      Zeerak Talat, Hagen Blix, Josef Valvoda, Maya Indira Ganesh, Ryan Cotterell, Adina Williams
  • 24/02/22: 6 papers accepted to ACL 2022

    • A Functionalist Account of Vowel System Typology
      Ryan Cotterell and Jason Eisner
    • High Probability or Low Information? The Probability–Quality Paradox in Language Generation
      Clara Isabel Meister, Gian Wiher, Tiago Pimentel, and Ryan Cotterell
    • Analyzing Wrap-Up Effects through an Information-Theoretic Lens
      Clara Meister, Tiago Pimentel, Thomas Hikaru Clark, Ryan Cotterell, and Roger P. Levy
    • Probing as Quantifying the Inductive Bias of Pre-trained Representations
      Alexander Immer, Lucas Torroba Hennigen, Vincent Fortuin, and Ryan Cotterell
    • Estimating the Entropy of Linguistic Distributions
      Aryaman Arora, Clara Isabel Meister, and Ryan Cotterell
    • Causal Probing for Grammatical Number: From Encoding to Usage
      Karim Lasri, Tiago Pimentel, Alessandro Lenci, Thierry Poibeau, and Ryan Cotterell
  • Spring 2022: Ryan is giving invited talks at Stanford, Google Research and KAUST; Clara is giving an invited talk at Instituto Superior Técnico.

  • Fall 2021: Ryan is giving invited talks at EPFL, University of Melbourne, University of Notre Dame, Gothenburg University and MIT.

  • 26/08/21: 7 papers accepted to EMNLP 2021; 4 papers accepted to Findings of the ACL: EMNLP 2021

    • Revisiting the Uniform Information Density Hypothesis
    • A Bayesian Framework for Information-Theoretic Probing
    • Efficient Sampling of Dependency Structure
    • Phone-level Uniform Information Density across and within Languages
    • On Homophony and Rényi Entropy
    • Conditional Poisson Stochastic Beams
    • Text or Topology? Classifying Ally-Enemy Pairs in Militarised Conflict
    • Equivariant Transduction through Invariant Alignment
    • Searching for More Efficient Dynamic Programs
    • Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
    • Adjusting the Conflict-Cooperation Scale for Armed Conflict Assessment
  • 05/05/21: 7 papers accepted to ACL 2021; 1 paper accepted to Findings of the ACL

    • A cognitive regularizer for language modeling
    • On Finding the K-best Non-projective Dependency Trees
    • Determinantal Beam Search
    • Is Sparse Attention more Interpretable?
    • Language Model Evaluation Beyond Perplexity
    • Examining the Inductive Bias of Neural Language Models with Artificial Languages
    • Higher-order Derivatives of Weighted Finite-state Machines
    • Modelling the Unigram Distribution
  • 10/03/21: 5 papers (3 long, 2 short) accepted to NAACL 2021

    • A Non-Linear Structural Probe
    • What About the Precedent: An Information-Theoretic Analysis of Common Law
    • Do Syntactic Probes Probe Syntax? Experiments with Jabberwocky Probing
    • Finding Concept-specific Biases in Form–Meaning Associations
    • How (Non-)Optimal is the Lexicon?





Teaching

SEE ALL CLASSES

Philosophy of Language and Computation

ETH Zürich Autumn 2022
This graduate class, taught like a seminar, is designed to help you understand the philosophical underpinnings of modern work in natural language processing (NLP), most of which is centered around statistical machine learning applied to natural language data.

Advanced Formal Language Theory

ETH Zürich Spring 2022
This course serves as an introduction to weighted formal language theory. The lectures cover in detail the theory and algorithms used to manipulate and compute with weighted automata and grammars. The emphasis is on rigor and depth rather than broad coverage. To motivate the theory, the course also covers various applications of formal language theory to modern-day machine learning. Students should expect a healthy dose of proof-writing, and mathematical maturity is therefore expected. In terms of background, the class draws on techniques from discrete math, analysis, and linear algebra. While there are no hard prerequisites, having taken a class covering basic graph algorithms will be helpful, as will familiarity with basic real analysis and linear algebra.

Natural Language Processing

ETH Zürich Fall 2021
This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.

Dependency Structures and Lexicalized Grammars

ETH Zürich Fall 2021
Dependency parsing is a fundamental task in natural language processing. This seminar explores a variety of algorithms for efficient dependency parsing and their derivation in a unified algebraic framework. The core ideas behind the mathematics of dependency parsing are explored.

ESSLLI 2021: Information Theory in Linguistics

Virtual Summer 2021
The European Summer School in Logic, Language and Information (ESSLLI) provides an interdisciplinary setting in which courses and workshops are offered in logic, linguistics and computer science. The Information Theory in Linguistics course focuses on the application of information-theoretic methods to natural language processing, emphasizing interdisciplinary connections with the field of linguistics.

Thesis Projects

If you are an MSc student at ETH Zurich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning and Statistics. To get a better sense of what currently interests us, we invite you to browse our recent publications. That said, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not interested, or would not be willing to become interested.
 
Please send an email to ryan.cotterell@inf.ethz.ch with CC to clara.meister@inf.ethz.ch and niklas.stoehr@inf.ethz.ch, and state either [bachelor’s thesis] or [master’s thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five concrete topics that interest you. We will try our best to find a project that suits your interests. We look forward to receiving your inquiry!

Available Projects

Joining Our Lab

Thank you very much for your interest in joining our group – unfortunately, we are no longer accepting PhD students!

If you are interested in working with us as a Master’s student, please see here. If co-advising is an option you would like to pursue: Ryan has previously co-advised Master’s students on NLP topics with Mrinmaya Sachan and others, and at Cambridge he co-advises MPhil students with Simone Teufel. We look forward to receiving your inquiry!