This talk is part of the NLP Seminar Series.

When Social Context Meets NLP: Learning with Less Data and More Structures

Diyi Yang, Georgia Tech
Date: 10:00am - 11:00am PT, Nov 12, 2020
Venue: Zoom (link hidden)

Abstract

Natural language processing (NLP) has seen increasing success in recent years and powers a wide range of industrial applications. Despite this success, current NLP systems often ignore the structure of language and rely heavily on massive amounts of labeled data. In this talk, we take a closer look at the interplay between language structures and computational methods through three lines of work. The first studies what makes language persuasive, introducing a semi-supervised method that leverages hierarchical structures in text to recognize persuasion strategies in good-faith requests on crowdfunding platforms. We then show how to incorporate linguistically informed relations between different training instances to help both text classification and sequence labeling when annotated data is limited. The last part demonstrates how various structures in conversations can be utilized to generate better summaries of everyday interactions.

Bio

Diyi Yang is an assistant professor in the School of Interactive Computing at Georgia Tech. She is broadly interested in Computational Social Science and Natural Language Processing, with the goal of modeling human communication in social contexts and building socially aware intelligent systems to support social interaction at scale. Diyi received her PhD from the Language Technologies Institute at Carnegie Mellon University and her bachelor's degree from Shanghai Jiao Tong University, China. Her work has been published at leading NLP and HCI conferences and has earned multiple award nominations at EMNLP 2015, ICWSM 2016, SIGCHI 2019, and CSCW 2020.