We explore new splitting methods emerging from the convex degenerate optimization framework, in which preconditioners are allowed to be positive semidefinite with potentially large kernels. First, we investigate abstract degenerate proximal point methods, establishing weak convergence results under mild assumptions and showing how the degeneracy of the preconditioner yields a systematic reduction of storage requirements. The proposed framework leads us to novel graph-based extensions of the Douglas-Rachford splitting method to sums of several operators, admitting distributed implementations over any given network. We then turn to degenerate forward-backward schemes, which lead to a hybrid method combining the advantages of the proximal-gradient and generalized conditional-gradient schemes. This approach efficiently addresses a new model for data-driven parameter tuning in total variation denoising, which is of independent interest and will be presented in detail.
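As a point of reference (a minimal sketch, not part of the announcement itself), the degenerate preconditioned proximal point iteration for a maximally monotone operator $\mathcal{A}$ and a self-adjoint, positive semidefinite preconditioner $\mathcal{M}$ can be written as

$$0 \in \mathcal{A}(x^{k+1}) + \mathcal{M}(x^{k+1} - x^{k}), \qquad \mathcal{M} = \mathcal{M}^{*} \succeq 0.$$

With $\mathcal{M} = \mathrm{Id}$ this reduces to the classical proximal point method $x^{k+1} = (\mathrm{Id} + \mathcal{A})^{-1}(x^{k})$, whereas a nontrivial kernel of $\mathcal{M}$ leaves the corresponding components of $x^{k}$ without influence on the update, which is the source of the storage reduction mentioned in the abstract.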
Splitting Methods in Convex Degenerate Optimization with Applications
12.12.2023 11:30 - 12:15
Organiser:
R.I. Bot, E.R. Csetnek, Y. Malitskyi, H. Schichl
Location: