Talk Info: Diyi Yang
Title: Natural Language Processing with Less Data and More Structures
Abstract: Recently, natural language processing (NLP) has seen increasing success and produced extensive industrial applications. Although sufficient to enable these applications, current NLP systems often ignore the structure of language and rely heavily on massive amounts of labeled data. In this talk, we take a closer look at the interplay between language structures and computational methods via two lines of work. The first studies how to incorporate linguistically informed relations between different training instances to help both text classification and sequence labeling when annotated data is limited. The second demonstrates how various structures in conversations can be utilized to generate better dialog summaries for everyday interactions.
Bio: Diyi Yang is an assistant professor in the School of Interactive Computing at Georgia Tech, where she is also affiliated with the Machine Learning Center (ML@GT). She is broadly interested in Computational Social Science and Natural Language Processing. Diyi received her PhD from the Language Technologies Institute at Carnegie Mellon University and her bachelor’s degree from Shanghai Jiao Tong University, China. Her work has been published at leading NLP/HCI conferences and has resulted in multiple award nominations at EMNLP 2015, ICWSM 2016, SIGCHI 2019, CSCW 2020, and SIGCHI 2021. She was named to the Science category of the Forbes 30 Under 30 list in 2020, is a recipient of the IEEE AI 10 to Watch award in 2020, and has received faculty research awards from Amazon, Facebook, and Salesforce.