Roundtable discussion: Efficient and adaptable large-scale AI
- Ahmed Awadallah (Microsoft Research Redmond), Jianfeng Gao (Microsoft Research Redmond), Danqi Chen (Princeton University), Song Han (MIT)
- Microsoft Research Summit 2021 | Deep Learning & Large-Scale AI
The AI landscape has been transformed by the advent of large-scale models like BERT, Turing, and, most recently, GPT-3. Researchers have pushed language models to new heights of performance, propelling advances in search, language translation, and more. As these models grow in size and capability, their applications are expanding, but they are also becoming harder to deploy and adapt efficiently. Join us for a roundtable discussion hosted by Senior Principal Researcher Ahmed Awadallah on the many ways we can improve efficiency and adaptability in next-generation AI. He will be joined by Microsoft Distinguished Scientist Jianfeng Gao, Princeton University Assistant Professor Danqi Chen, and Song Han, Assistant Professor in MIT’s Department of Electrical Engineering and Computer Science.
Learn more about the 2021 Microsoft Research Summit: https://Aka.ms/researchsummit
Ahmed Awadallah
Partner Research Manager
Microsoft Research Redmond

Danqi Chen
Assistant Professor
Princeton University

Jianfeng Gao
Distinguished Scientist & Vice President
Microsoft Research Redmond

Song Han
Assistant Professor
MIT
Deep Learning & Large-Scale AI
Opening remarks: Deep Learning and Large-Scale AI
- Ahmed Awadallah
Roundtable discussion: Efficient and adaptable large-scale AI
- Ahmed Awadallah, Jianfeng Gao, Danqi Chen, Song Han
Panel: Large-scale neural platform models: Opportunities, concerns, and directions
- Eric Horvitz, Miles Brundage, Yejin Choi
Research talk: WebQA: Multihop and multimodal
- Yonatan Bisk
Roundtable discussion: Beyond language models: Knowledge, multiple modalities, and more
- Yonatan Bisk, Daniel McDuff, Dragomir Radev
Closing remarks: Deep Learning and Large-Scale AI
- Jianfeng Gao