Fast continuous and discrete time approaches for smooth and nonsmooth optimization featuring time scaling and Tikhonov regularization

13.01.2025 09:30 - 11:00

Mikhail Karapetyants (University of Vienna)

Abstract:
This thesis is devoted to the classical optimization problem of minimizing a convex function via different methods in continuous and discrete time. Both smooth and nonsmooth objective functions are discussed. The main focus throughout the manuscript is on obtaining fast convergence rates for the function values, combined with convergence of the trajectories of the dynamical systems or of the iterates of the algorithms. Major attention is paid to the Tikhonov regularization technique, which is known to upgrade the weak convergence of the trajectories (iterates) to controlled strong convergence, namely to the element of minimal norm in the set of all minimizers of the objective function. In some cases, time scaling is applied to accelerate the convergence of the function values. Various numerical experiments are presented to illustrate the theoretical results.
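To give a flavor of the regularization effect described above, here is a minimal sketch (not taken from the thesis) of Tikhonov-regularized gradient descent on a toy objective with non-unique minimizers. The objective f(x, y) = x², the step size, and the vanishing schedule ε_k = 1/√k are illustrative assumptions only:

```python
import math

def grad_f(x, y):
    # f(x, y) = x**2: every point (0, y) is a minimizer;
    # the minimal-norm minimizer is the origin.
    return 2.0 * x, 0.0

def tikhonov_gd(x, y, steps=10000, s=0.1, tikhonov=True):
    """Gradient descent on f plus a Tikhonov term (eps_k / 2) * ||(x, y)||^2."""
    for k in range(1, steps + 1):
        # eps_k -> 0 slowly (sum eps_k diverges), which is the regime
        # under which strong convergence to the minimal-norm minimizer holds.
        eps = 1.0 / math.sqrt(k) if tikhonov else 0.0
        gx, gy = grad_f(x, y)
        x -= s * (gx + eps * x)
        y -= s * (gy + eps * y)
    return x, y

# Plain gradient descent stops at the nearest minimizer, roughly (0, 3) ...
print(tikhonov_gd(1.0, 3.0, tikhonov=False))
# ... while the regularized iterates drift to the minimal-norm minimizer (0, 0).
print(tikhonov_gd(1.0, 3.0, tikhonov=True))
```

The contrast between the two runs is the whole point: without regularization the limit depends on the starting point, whereas the Tikhonov term selects the minimal-norm minimizer.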
Key words: Smooth and nonsmooth convex optimization; Damped inertial dynamics; Hessian-driven damping; Time scaling; Moreau envelope; Proximal operator; Tikhonov regularization; Strong convergence; Inertial algorithms.
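Two of the nonsmooth tools named in the keywords admit a short self-contained illustration for f(x) = |x|, whose proximal operator is soft-thresholding and whose Moreau envelope is the Huber function. These are the standard one-dimensional formulas, not code from the thesis:

```python
import math

def prox_abs(x, lam):
    # Proximal operator of f(x) = |x| with parameter lam > 0:
    # argmin_p |p| + (p - x)**2 / (2 * lam)  =  soft-thresholding.
    return math.copysign(max(abs(x) - lam, 0.0), x)

def moreau_abs(x, lam):
    # Moreau envelope of |x|: a smooth (C^1) function with the
    # same minimizer, evaluated via the prox point.
    p = prox_abs(x, lam)
    return abs(p) + (x - p) ** 2 / (2.0 * lam)

print(prox_abs(2.0, 0.5))    # shrinks toward 0 by lam
print(moreau_abs(0.2, 0.5))  # quadratic near 0, linear far away
```

Replacing a nonsmooth f by its Moreau envelope in this way is what lets gradient-type dynamics and algorithms be applied in the nonsmooth setting.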

Zoom-Link:

univienna.zoom.us/j/69370747168
Meeting ID: 693 7074 7168
Passcode: 159032

Organiser:

Faculty of Mathematics, Dean Radu Ioan Boţ

Location:
Zoom