This talk is part of the NLP Seminar Series.

Cross-lingual representation learning

Alexis Conneau, Facebook AI Research
Date: 11:00 am - 12:00 pm, Oct 17th 2019
Venue: Room 219 (open space), Gates Computer Science Building

Abstract

In this talk, I will present some of my contributions to the unsupervised alignment of word and sentence representations across many languages. I will first discuss the unsupervised alignment of word embeddings and how this step made unsupervised machine translation possible. I will then focus on multilingual masked language models (MLMs) and show their surprising effectiveness for pretraining cross-lingual classification, sequence labeling, and machine translation models. I will also present further analysis of these models and discuss recent work on what makes them multilingual. Finally, I will conclude by presenting some of the NLU challenges we face at Facebook.

Bio

Alexis Conneau is a research scientist at Facebook AI, where he completed his PhD at FAIR Paris in 2019. He previously graduated from École Polytechnique in Mathematics in 2015. His research revolves around unsupervised learning, transfer learning for NLU, and cross-lingual understanding for low-resource languages; his past work in these areas includes InferSent, MUSE, XNLI, and XLM. Alexis has published articles at major NLP and machine learning conferences, including ACL, EMNLP, ICLR, and NeurIPS. He received an outstanding paper award at EMNLP 2017 and co-authored the EMNLP 2018 best paper.