Project

Understanding Language Representations in Deep Learning Models

Our goal is to understand how computational models represent language. We develop new models for representing natural language and investigate how existing models learn linguistic representations, focusing on neural networks for key tasks such as machine translation and speech recognition.

Members

Publications