Avoiding Saddle Points in Nonsmooth Optimization
Abstract: This talk asks: Do common first-order methods avoid saddle points of nonsmooth functions? Our main result shows that for a "generic" class of semialgebraic functions, randomly initialized proximal methods and stochastically perturbed subgradient methods converge only to local minimizers.
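To illustrate the kind of method the abstract refers to, here is a minimal sketch of a stochastically perturbed subgradient method on a simple nonsmooth semialgebraic function. The test function f, the step size, and the noise scale are illustrative assumptions, not the construction from the talk: f(x, y) = |x| + (y^2 - 1)^2 has a Clarke-critical saddle at the origin and minimizers at (0, ±1), and the perturbation lets the iterates escape the saddle.

```python
import numpy as np

def f(z):
    # illustrative nonsmooth function: saddle at (0, 0), minimizers at (0, +/-1)
    x, y = z
    return abs(x) + (y**2 - 1)**2

def subgrad(z):
    # one element of the Clarke subdifferential of f at z
    x, y = z
    gx = np.sign(x)          # a subgradient of |x| (equals 0 at x = 0)
    gy = 4 * y * (y**2 - 1)  # gradient of the smooth part in y
    return np.array([gx, gy])

rng = np.random.default_rng(0)
z = np.zeros(2)  # start exactly at the saddle point (0, 0)
step = 0.01      # illustrative constant step size
for k in range(5000):
    noise = 0.01 * rng.standard_normal(2)  # stochastic perturbation
    z = z - step * (subgrad(z) + noise)

# the perturbation pushes the iterates off the saddle;
# they then track the subgradient flow toward a minimizer (0, +/-1)
```

Without the noise term, the iterates started at (0, 0) would remain at the saddle forever, since 0 is a valid subgradient there; the perturbation is what makes escape possible.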
29.11.2021 15:30 - 16:30
Organisers:
R. I. Boț (U Wien), S. Sabach (Technion - Israel Institute of Technology Haifa), M. Staudigl (Maastricht U)
Location:
Zoom Meeting
Related Files
- abstract_Davis_OWOS.pdf 142 KB