We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.

Lab Motto: We put the fun in funicular!

Current Foci

  • Decoding Strategies for Neural MT
  • Information-Theoretic Linguistics
  • Computational Typology
  • Computational Morphology
  • Bias and Fairness in NLP Systems
  • Computational Approaches to Metaphor
  • Low-resource Linguistic Annotation
  • Algorithms for Parsing
  • Interpreting Neural Representations of Language
  • Computational Social Science
  • NLP Applications

Lab News

  • 13/02/23: Ryan is giving an invited talk at the Dagstuhl Seminar.
  • 19/07/22: David Mortensen (CMU) is giving an invited talk.
  • 04/07/22: Noga Zaslavsky (MIT) is giving an invited talk.
  • 10/06/22: Aaron Schein (Columbia University) is giving an invited talk.

  • 23/05/22: Ryan is giving the STIRS Keynote at ACL 2022.

  • 24/02/22: 6 papers accepted to NAACL 2022

    • Same Neurons, Different Languages: Probing Morphosyntax in Multilingual Pre-trained Models
      Karolina Stańczak, Edoardo Ponti, Lucas Torroba Hennigen, Ryan Cotterell, Isabelle Augenstein
    • A Structured Span Selector
      Tianyu Liu, Yuchen Eleanor Jiang, Ryan Cotterell, Mrinmaya Sachan
    • BlonDe: An Automatic Evaluation Metric for Document-level Machine Translation
      Yuchen Eleanor Jiang, Tianyu Liu, Shuming Ma, Dongdong Zhang, Jian Yang, Haoyang Huang, Rico Sennrich, Ryan Cotterell, Mrinmaya Sachan, Ming Zhou
    • Exact Paired-Permutation Testing for Structured Test Statistics
      Ran Zmigrod, Tim Vieira, Ryan Cotterell
    • Probing via Prompting
      Jiaoda Li, Ryan Cotterell, Mrinmaya Sachan
    • A Word on Machine Ethics: A Response to Jiang et al. (2021)
      Zeerak Talat, Hagen Blix, Josef Valvoda, Maya Indira Ganesh, Ryan Cotterell, Adina Williams
  • 24/02/22: 6 papers accepted to ACL 2022

    • A Functionalist Account of Vowel System Typology
      Ryan Cotterell and Jason Eisner
    • High probability or low information? The probability–quality paradox in language generation
      Clara Isabel Meister, Gian Wiher, Tiago Pimentel, and Ryan Cotterell
    • Analyzing Wrap-Up Effects through an Information-Theoretic Lens
      Clara Meister, Tiago Pimentel, Thomas Hikaru Clark, Ryan Cotterell, and Roger P. Levy
    • Probing as Quantifying the Inductive Bias of Pre-trained Representations
      Alexander Immer, Lucas Torroba Hennigen, Vincent Fortuin, and Ryan Cotterell
    • Estimating the Entropy of Linguistic Distributions
      Aryaman Arora, Clara Isabel Meister, and Ryan Cotterell
    • Causal Probing for Grammatical Number: From Encoding to Usage
      Karim Lasri, Tiago Pimentel, Alessandro Lenci, Thierry Poibeau, and Ryan Cotterell
  • Spring 2022: Ryan is giving invited talks at Stanford, Google Research and KAUST; Clara is giving an invited talk at Instituto Superior Técnico.

  • Fall 2021: Ryan is giving invited talks at EPFL, University of Melbourne, University of Notre Dame, Gothenburg University and MIT.

  • 26/08/21: 7 papers accepted to EMNLP 2021; 4 papers accepted to Findings of the ACL: EMNLP 2021

    • Revisiting the Uniform Information Density Hypothesis
    • A Bayesian Framework for Information-Theoretic Probing
    • Efficient Sampling of Dependency Structures
    • Phone-level Uniform Information Density across and within Languages
    • On Homophony and Rényi Entropy
    • Conditional Poisson Stochastic Beams
    • Text or Topology? Classifying Ally-Enemy Pairs in Militarised Conflict
    • Equivariant Transduction through Invariant Alignment
    • Searching for More Efficient Dynamic Programs
    • Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
    • Adjusting the Conflict-Cooperation Scale for Armed Conflict Assessment
  • 05/05/21: 7 papers accepted to ACL 2021; 1 paper accepted to Findings of the ACL: ACL 2021

    • A Cognitive Regularizer for Language Modeling
    • On Finding the K-best Non-projective Dependency Trees
    • Determinantal Beam Search
    • Is Sparse Attention more Interpretable?
    • Language Model Evaluation Beyond Perplexity
    • Examining the Inductive Bias of Neural Language Models with Artificial Languages
    • Higher-order Derivatives of Weighted Finite-state Machines
    • Modelling the Unigram Distribution
  • 10/03/21: 5 papers (3 long, 2 short) accepted to NAACL 2021

    • A Non-Linear Structural Probe
    • What About the Precedent: An Information-Theoretic Analysis of Common Law
    • Do Syntactic Probes Probe Syntax? Experiments with Jabberwocky Probing
    • Finding Concept-specific Biases in Form–Meaning Associations
    • How (Non-)Optimal is the Lexicon?

Teaching

Advanced Formal Language Theory

ETH Zürich Spring 2023
This course serves as an introduction to various advanced topics in formal language theory. The primary focus is on weighted formalisms, which lend themselves readily to applications in machine learning. Topics include finite-state machines and the algorithms commonly used to manipulate them. We will also cover weighted context-free grammars, weighted tree automata, and weighted mildly context-sensitive formalisms.
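For a taste of the weighted view, here is a minimal, illustrative sketch (our own toy example, not course material; all names are invented) of a weighted finite-state automaton over the real semiring. The forward algorithm below sums the weights of all paths labeled by an input string:

    from collections import defaultdict

    class WFSA:
        """A toy weighted finite-state automaton over the real semiring."""

        def __init__(self, init, final):
            self.init = init                # state -> initial weight
            self.final = final              # state -> final weight
            self.arcs = defaultdict(list)   # (state, symbol) -> [(next state, weight)]

        def add_arc(self, q, a, r, w):
            self.arcs[(q, a)].append((r, w))

        def weight(self, string):
            # Forward algorithm: sum the weights of all accepting paths
            # labeled by `string`, in time O(|string| * number of arcs).
            alpha = dict(self.init)
            for a in string:
                beta = defaultdict(float)
                for q, wq in alpha.items():
                    for r, w in self.arcs.get((q, a), []):
                        beta[r] += wq * w
                alpha = beta
            return sum(wq * self.final.get(q, 0.0) for q, wq in alpha.items())

    # A two-state automaton over {a, b}: the only accepting path for "aab"
    # is 0 -a-> 0 -a-> 1 -b-> 1, with weight 0.5 * 0.5 * 1.0 = 0.25.
    A = WFSA(init={0: 1.0}, final={1: 1.0})
    A.add_arc(0, "a", 0, 0.5)
    A.add_arc(0, "a", 1, 0.5)
    A.add_arc(1, "b", 1, 1.0)
    print(A.weight("aab"))  # 0.25

The same recursion works over any semiring: replacing (+, ×) with (max, ×) computes the weight of the single best path instead of the sum over all paths.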

Large Language Models

ETH Zürich Spring 2023
Large language models have become one of the most widely deployed NLP technologies. Over the past half-decade, their integration into core natural language processing tools has dramatically improved the performance of those tools, and they have entered the public discourse surrounding artificial intelligence. In this course, we start with the probabilistic foundations of language models, i.e., what constitutes a language model from a formal, theoretical perspective. We then discuss how to construct and curate training corpora, and introduce many of the neural-network architectures often used to instantiate language models at scale. The course also covers aspects of systems programming, a discussion of privacy and harms, and applications of language models in NLP and beyond.
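To make the formal perspective concrete, one standard way to state the definition (our illustration, not the course's official notation) is that a language model is a probability distribution p over the set of finite strings Σ* built from a vocabulary Σ, usually factorized autoregressively with a distinguished end-of-string symbol EOS:

    \[
      \sum_{\boldsymbol{y} \in \Sigma^*} p(\boldsymbol{y}) = 1,
      \qquad
      p(\boldsymbol{y}) = p(\mathrm{EOS} \mid \boldsymbol{y})
        \prod_{t=1}^{|\boldsymbol{y}|} p\bigl(y_t \mid \boldsymbol{y}_{<t}\bigr).
    \]

The EOS factor is what lets the conditional probabilities define a proper distribution over finite strings rather than leaking probability mass onto infinitely long ones.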

Philosophy of Language and Computation II

ETH Zürich Spring 2023
This graduate class, taught as a seminar, is designed to help you understand the philosophical underpinnings of modern work in natural language processing (NLP), most of which is centered around statistical machine learning applied to natural language data.

ESSLLI 2023 Tutorial: Formal Language Theory and Neural Networks

ETH Zürich Spring 2023

Thesis Projects

If you are an MSc student at ETH Zürich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning and Statistics. To get a better sense of what currently interests us, we invite you to look at our recent publications. However, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not interested in it.
 
Please send an email to ryan.cotterell@inf.ethz.ch with CC to clara.meister@inf.ethz.ch and niklas.stoehr@inf.ethz.ch, and state either [bachelor’s thesis] or [master’s thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five concrete topics that interest you. We will try our best to find a project that suits your interests. We look forward to receiving your inquiry!

Joining Our Lab

Thank you very much for your interest in joining our group. Unfortunately, we are no longer accepting PhD students!

If you are interested in working with us as a Master’s student, please see here. If co-advising is an option you would like to pursue, note that Ryan has previously co-advised Master’s students on NLP topics with Mrinmaya Sachan and others. At Cambridge, Ryan co-advises MPhil students with Simone Teufel. We look forward to receiving your inquiry!