Avoiding Saddle Points in Nonsmooth Optimization

29.11.2021 15:30 - 16:30

Damek Davis (Cornell University)

Abstract: This talk asks: Do common first-order methods avoid saddle points of nonsmooth functions? Our main result shows that for a "generic" class of semialgebraic functions, randomly initialized proximal methods and stochastically perturbed subgradient methods converge only to local minimizers.
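The phenomenon in the abstract can be illustrated with a minimal sketch (not taken from the talk): a stochastically perturbed subgradient method on the nonsmooth function f(x, y) = |x| − |y|, which has a saddle at the origin. The step size, noise scale, and iteration count below are illustrative choices.

```python
import numpy as np

def subgrad(z):
    """One element of the Clarke subdifferential of f(x, y) = |x| - |y|."""
    x, y = z
    # np.sign(0) = 0, so the zero vector lies in the subdifferential at the
    # origin and an unperturbed method can stall exactly at the saddle.
    return np.array([np.sign(x), -np.sign(y)])

def perturbed_subgradient(z0, steps=2000, alpha=1e-2, noise=1e-2, seed=0):
    """Subgradient descent with small Gaussian perturbations (illustrative)."""
    rng = np.random.default_rng(seed)
    z = np.asarray(z0, dtype=float)
    for _ in range(steps):
        z = z - alpha * (subgrad(z) + noise * rng.standard_normal(2))
    return z

# Initialize exactly at the saddle: the random perturbations push the
# iterate off (0, 0), after which |y| grows while x stays near 0.
z = perturbed_subgradient([0.0, 0.0])
```

The run ends with the iterate far from the saddle in the y-direction, consistent with the talk's claim that such methods converge only to local minimizers.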

Organisers:
R. I. Boț (U Wien), S. Sabach (Technion - Israel Institute of Technology, Haifa), M. Staudigl (Maastricht U)
Location:
Zoom Meeting