January 24, 2020

MSR Cambridge Lab Lecture: Deep (Inter-)Active Learning for NLP: Cure-all or Catastrophe?

15:00-16:00

Location: MSR Cambridge

While deep learning produces supervised models with unprecedented predictive performance on many tasks, under typical training procedures, its advantages over classical methods emerge only with large datasets. The extreme data-dependence of reinforcement learners may be even more problematic. Millions of experiences sampled from video games come cheaply, but systems that learn from human interaction can’t afford to waste so much labor. In this talk, I will discuss several efforts to increase the labor-efficiency of learning from human interactions. Specifically, I will cover work on learning dialogue policies, deep active learning for natural language processing, learning from noisy and singly-labeled data, and active learning with partial feedback. Finally, time permitting, I’ll discuss an approach for reducing the reliance of NLP models on spurious associations in the data, built on a new mechanism for interacting with annotators.
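The abstract names deep active learning without spelling out the setup, so purely as context, here is a minimal sketch of pool-based active learning with uncertainty sampling, one standard formulation of the idea. The synthetic data, logistic-regression model, entropy criterion, and batch size are illustrative assumptions, not details from the talk.

```python
# Minimal sketch of pool-based active learning via uncertainty sampling.
# Illustrative only; the dataset, model, and query budget are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic pool standing in for a large unlabeled NLP dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = list(rng.choice(len(X), size=20, replace=False))       # small seed set
unlabeled = [i for i in range(len(X)) if i not in set(labeled)]

model = LogisticRegression(max_iter=1000)

for round_ in range(10):
    model.fit(X[labeled], y[labeled])

    # Score the unlabeled pool by predictive entropy (higher = more uncertain).
    probs = model.predict_proba(X[unlabeled])
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    # "Query the annotator" for the 20 most uncertain examples;
    # here the ground-truth labels y stand in for the annotator.
    query = np.argsort(entropy)[-20:]
    queried_ids = [unlabeled[i] for i in query]
    labeled.extend(queried_ids)
    unlabeled = [i for i in unlabeled if i not in set(queried_ids)]

    print(f"round {round_}: labeled={len(labeled)} acc={model.score(X, y):.3f}")
```

The point of the loop is the labor trade-off the talk is about: instead of labeling the whole pool, the model asks only for the labels it is least certain about each round.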