Conference Week (AISTATS & ICML 2022)

Event

OSI Lab Conference Week (AISTATS & ICML 2022)

Short summary

This seminar covers some recent progress in optimization theory and deep learning theory. On the optimization side, I cover continuous-time analysis of accelerated optimization algorithms; on the deep learning side, I cover the so-called “edge of stability” phenomenon.

Abstract

The talk is divided into two parts. The first part, An Introduction to Continuous-Time Analysis of Accelerated Optimization Algorithms and Its Recent Progress, deals with analyzing accelerated optimization algorithms, such as Nesterov's momentum method and Polyak's heavy-ball method, from a (second-order) ODE perspective. The first paper, from SNU Math, proposes a novel change of coordinates that provides a unified framework for deriving the convergence rates of the optimization ODEs via conservation of “energy”. The second paper provides a novel insight into the stability of the optimization ODEs from a control-theoretic perspective, specifically by leveraging contraction theory. The last two papers deal with recent progress on (accelerated) optimization methods on Riemannian manifolds.
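As background for the first part (this is standard material, not a summary of the papers' specific contributions), Nesterov's accelerated gradient method on a convex function $f$ is known to have the continuous-time limit

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,
\]

while Polyak's heavy-ball method corresponds to constant damping, $\ddot{X}(t) + a\,\dot{X}(t) + \nabla f(X(t)) = 0$. Convergence rates such as $f(X(t)) - f(x^\star) = O(1/t^2)$ are typically obtained by exhibiting an energy (Lyapunov) function that is non-increasing along trajectories, which is the kind of argument the change-of-coordinates framework above systematizes.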

The second part, Recent Progress on Edge of Stability, deals with understanding the recently discovered phenomenon called the “Edge of Stability”, first reported in (Cohen et al., ICLR'21). The first paper gives initial insights into why this phenomenon occurs and what its main features are. The second paper tackles it in a more principled and theoretical manner.
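For context on the phenomenon itself (a brief recap of standard facts rather than a summary of the two papers): Cohen et al. observe that, when training with full-batch gradient descent with step size $\eta$, the sharpness, i.e. the largest eigenvalue $\lambda_{\max}$ of the training-loss Hessian, keeps rising until it reaches approximately $2/\eta$ and then hovers there, while the training loss continues to decrease over long timescales despite short-term oscillations. The threshold $2/\eta$ comes from the classical stability analysis of gradient descent on a quadratic: for $f(x) = \frac{\lambda}{2}x^2$, the iteration

\[
x_{k+1} = x_k - \eta f'(x_k) = (1 - \eta\lambda)\, x_k
\]

converges if and only if $|1 - \eta\lambda| < 1$, i.e. $\eta < 2/\lambda$; the “edge of stability” regime is precisely where this classical condition is marginally violated.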

Papers

Papers discussed in the seminar.

Part 1:

  • Jaewook J. Suh, Gyumin Roh, and Ernest K. Ryu. Continuous-Time Analysis of Accelerated Gradient Methods via Conservation Laws in Dilated Coordinate Systems. In ICML 2022.
  • Pedro Cisneros-Velarde and Francesco Bullo. A Contraction Theory Approach to Optimization Algorithms from Acceleration Flows. In AISTATS 2022.
  • Jungbin Kim and Insoon Yang. Accelerated Gradient Methods for Geodesically Convex Optimization. In ICML 2022.
  • Pierre Ablin and Gabriel Peyre. Fast and accurate optimization on the orthogonal manifold without retraction. In AISTATS 2022.

Part 2:

  • Kwangjun Ahn, Jingzhao Zhang, and Suvrit Sra. Understanding the Unstable Convergence of Gradient Descent. In ICML 2022.
  • Sanjeev Arora, Zhiyuan Li, and Abhishek Panigrahi. Understanding Gradient Descent on the Edge of Stability in Deep Learning. In ICML 2022.