We are a collocation of collaborators working on a diverse range of topics in computational linguistics, natural language processing and machine learning.
Lab Motto: We put the fun in funicular!
24/2: 6 papers accepted to NAACL 2022
24/2: 6 papers accepted to ACL 2022
Spring 2022: Ryan is giving invited talks at Stanford, Google Research and KAUST; Clara is giving an invited talk at Instituto Superior Técnico.
Fall 2021: Ryan is giving invited talks at EPFL, University of Melbourne, University of Notre Dame, Gothenburg University and MIT.
26/8: 7 papers accepted to EMNLP 2021; 4 papers accepted to Findings of the ACL: EMNLP 2021
5/5: 7 papers accepted at ACL 2021; 1 paper accepted to Findings of the ACL
10/3: 5 papers (3 long, 2 short) accepted to NAACL 2021
ETH Zürich Spring 2022
This course serves as an introduction to weighted formal language theory. The lectures cover in detail the theory and algorithms used to manipulate and compute with weighted automata and grammars. The emphasis is on rigor and depth rather than broad coverage. To motivate the theory, the course will also cover various applications of formal language theory to modern-day machine learning. Students should expect a healthy dose of proof-writing and, thus, mathematical maturity is expected. In terms of background, the class will draw on techniques from discrete math, analysis, and linear algebra. While there are no hard prerequisites, having taken a class that covers basic graph algorithms will be helpful, as will familiarity with basic real analysis and linear algebra.
ETH Zürich Fall 2021
This course presents topics in natural language processing with an emphasis on modern techniques, primarily focusing on statistical and deep learning approaches. The course provides an overview of the primary areas of research in language processing as well as a detailed exploration of the models and techniques used both in research and in commercial natural language systems.
ETH Zürich Fall 2021
Dependency parsing is a fundamental task in natural language processing. This seminar explores a variety of algorithms for efficient dependency parsing and their derivation in a unified algebraic framework. The core ideas behind the mathematics of dependency parsing are explored.
Virtual Summer 2021
The European Summer School in Logic, Language and Information Summer School (ESSLLI) provides an interdisciplinary setting in which courses and workshops are offered in logic, linguistics and computer science. The Information Theory in Linguistics course focuses on the application of information-theoretic methods to natural language processing, emphasizing interdisciplinary connections with the field of linguistics.
If you are an MSc student at ETH Zurich interested in writing your thesis with us, we would be delighted to hear from you! Unfortunately, we do not have the capacity to consider students from outside ETH for thesis projects. Our research revolves around theoretical and applied problems in Natural Language Processing, Computational Linguistics, Machine Learning and Statistics. To get a better sense of what currently interests us, we invite you to browse our recent publications. However, feel free to express interest in any topic you think our group might be well suited to advise you on: just because we have not yet looked into a topic does not mean we are not interested in it, or would not be willing to become interested.
Please send an email to ryan.cotterell@inf.ethz.ch with CC to clara.meister@inf.ethz.ch and niklas.stoehr@inf.ethz.ch, and state either [bachelor's thesis] or [master's thesis] at the start of the subject line. So that we can get to know you a little, please write a paragraph introducing yourself and explaining why you are interested in working with us. It would also help us a lot if you provided a list of four or five concrete topics that interest you. We will try our best to find a project that suits your interests. We look forward to receiving your inquiry!
Our lab, along with Mrinmaya Sachan’s lab, meets weekly on Fridays at 15:00 CEST. Here’s what we’re reading!
04/12/19: Semantic categories of artifacts and animals reflect efficient coding
11/12/19: SpanBERT: Improving Pre-training by Representing and Predicting Spans
15/01/20: Gender-Aware Reinflection using Linguistically Enhanced Neural Models
Thank you very much for your interest in joining our group – unfortunately, we are no longer accepting PhD students!
If you are interested in working with us as a Master's student, please see here. If co-advising is an option you would like to pursue, note that Ryan has previously co-advised Master's students on NLP topics with Mrinmaya Sachan and others. At Cambridge, Ryan co-advises MPhil students with Simone Teufel. We look forward to receiving your inquiry!