Anej Svete
Latest
On Efficiently Representing Regular Languages as RNNs
Transformers Can Represent n-gram Language Models
Lower Bounds on the Expressivity of Recurrent Neural Language Models
On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning
What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages
On the Representational Capacity of Recurrent Neural Language Models
Recurrent Neural Language Models as Probabilistic Finite-state Automata
Algorithms for Weighted Finite-State Automata with Failure Arcs