Research talk: Computationally efficient large-scale AI
- Song Han | MIT
- Microsoft Research Summit 2021 | Deep Learning & Large-Scale AI
Today’s AI is too big. Deep neural networks demand extraordinary amounts of computation, and therefore power and carbon, for both training and inference. In this research talk, Song Han of MIT presents TinyML and efficient deep learning techniques that make AI greener, smaller, faster, and deployable on IoT devices.
Learn more about the 2021 Microsoft Research Summit: https://Aka.ms/researchsummit
Deep Learning & Large-Scale AI
Opening remarks: Deep Learning and Large-Scale AI
- Ahmed Awadallah
Roundtable discussion: Efficient and adaptable large-scale AI
- Ahmed Awadallah
- Jianfeng Gao
- Danqi Chen
Panel: Large-scale neural platform models: Opportunities, concerns, and directions
- Eric Horvitz
- Miles Brundage
- Yejin Choi
Research talk: WebQA: Multihop and multimodal
- Yonatan Bisk
Roundtable discussion: Beyond language models: Knowledge, multiple modalities, and more
- Yonatan Bisk
- Daniel McDuff
- Dragomir Radev
Closing remarks: Deep Learning and Large-Scale AI
- Jianfeng Gao