Overview
Deep networks have shown outstanding scaling properties in terms of both data and model size: larger models perform better. Unfortunately, the computational cost of current state-of-the-art methods is prohibitive. A number of techniques, such as conditional computation, adaptive computation, dynamic model sparsification, and early-exit approaches, have recently arisen to address this fundamental quality-cost trade-off. This workshop explores these exciting and practically relevant research avenues. As part of the contributed content, we will invite high-quality papers on the following topics: dynamic routing, mixture-of-experts models, early-exit methods, conditional computation, capsules and object-oriented learning, reusable components, online network growing and pruning, online neural architecture search, and applications of dynamic networks (continual learning, wireless/embedded devices, and similar topics).
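To make the quality-cost trade-off concrete, here is a minimal, framework-free sketch of the early-exit idea: a cheap sub-network answers confident ("easy") inputs immediately, and only ambiguous inputs pay for the expensive full network. All names (`TwoStageEarlyExit`, the stage functions, the confidence threshold) are illustrative assumptions for this sketch, not taken from any particular paper or library.

```python
import math


def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]


class TwoStageEarlyExit:
    """Toy dynamic network: a cheap stage with an auxiliary classifier,
    followed by an expensive stage that runs only when the cheap stage
    is not confident enough. Purely illustrative; real early-exit models
    attach auxiliary classifiers to intermediate layers of a trained net."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold       # confidence needed to exit early
        self.expensive_calls = 0         # counts how often the costly path runs

    def cheap_stage(self, x):
        # Stand-in for a small sub-network producing 2-class logits.
        return [x, -x]

    def expensive_stage(self, x):
        # Stand-in for the full network: sharper logits, higher cost.
        self.expensive_calls += 1
        return [10 * x, -10 * x]

    def forward(self, x):
        probs = softmax(self.cheap_stage(x))
        if max(probs) >= self.threshold:  # confident: skip the costly path
            return probs, "early"
        return softmax(self.expensive_stage(x)), "full"


model = TwoStageEarlyExit(threshold=0.9)
p_easy, path_easy = model.forward(5.0)   # large margin: exits at the cheap stage
p_hard, path_hard = model.forward(0.1)   # ambiguous: falls through to the full path
```

In this toy, per-input compute adapts to input difficulty, which is the shared motivation behind early-exit, conditional-computation, and mixture-of-experts methods discussed at the workshop.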
The 1st Dynamic Neural Networks workshop will be a hybrid workshop at ICML 2022 on July 22, 2022. Our goal is to advance the discussion of the topic by highlighting contributions that propose innovative approaches to dynamic neural networks.
Announcements
- Workshop schedule is announced!
- Accepted papers (poster and oral presentations) are announced. Congratulations to all authors!
- Microsoft CMT Submission portal is now open!
Submission Deadline: May 31, 2022 (Anywhere on Earth)
Author Notification: June 13, 2022
Video Deadline: June 28, 2022
Camera Ready Deadline: July 9, 2022
Workshop Day: July 22, 2022
Speakers
Invited Speakers
Organizers
Panel Chairs
Program Committee
- Canwen Xu, UC San Diego
- Yigitcan Kaya, University of Maryland
- Maciej Wolczyk, Jagiellonian University
- Bartosz Wojcik, Jagiellonian University
- Yoshitomo Matsubara, Amazon Alexa AI
- Thomas Verelst, KU Leuven
Contact: icmldynamicnn@gmail.com