
## Adaptive Catalyst for Smooth Convex Optimization

Optimization and Control. Working papers by Cornell University, 2019
Ivanova A. S., Gasnikov A., Pasechnyuk D., Grishchenko D., Shulgin E., Matyukhin V.
In this paper, we present a generic framework that allows accelerating almost arbitrary non-accelerated deterministic and randomized algorithms for smooth convex optimization problems. The main approach of our envelope is the same as in Catalyst (Lin et al., 2015): an accelerated proximal outer gradient method is used as an envelope for a non-accelerated inner method applied to the ℓ2-regularized auxiliary problem. Our algorithm has two key differences: 1) easily verifiable stopping criteria for the inner algorithm; 2) the regularization parameter can be tuned along the way. As a result, the main contribution of our work is a new framework that applies to adaptive inner algorithms: Steepest Descent, Adaptive Coordinate Descent, and Alternating Minimization. Moreover, in the non-adaptive case, our approach yields a Catalyst-type rate without the logarithmic factor that appears in the standard Catalyst (Lin et al., 2015, 2018).
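The general shape of such an envelope can be sketched as follows. This is a minimal illustrative sketch, not the authors' algorithm: it uses plain gradient descent as the inner method, a simplified `k/(k+3)` momentum rule for the outer extrapolation, and a gradient-norm test as a stand-in for the paper's stopping criteria; the function names and the specific tolerance rule are assumptions for illustration.

```python
import numpy as np

def catalyst_sketch(f_grad, x0, kappa=1.0, outer_iters=100,
                    inner_step=0.1, inner_iters=1000, inner_tol_factor=0.1):
    """Catalyst-style envelope (illustrative sketch).

    Outer loop: accelerated proximal point iteration.
    Inner loop: non-accelerated gradient descent on the l2-regularized
    auxiliary problem  F(z) = f(z) + (kappa/2) * ||z - y||^2,
    stopped by an easily verifiable test on the gradient norm of F.
    """
    x = x0.copy()
    y = x0.copy()
    for k in range(outer_iters):
        # Inner method: gradient descent on the regularized subproblem.
        z = x.copy()
        for _ in range(inner_iters):
            g = f_grad(z) + kappa * (z - y)  # gradient of F at z
            # Verifiable stopping criterion on ||grad F|| (assumed rule).
            if np.linalg.norm(g) <= inner_tol_factor * kappa * np.linalg.norm(z - y) + 1e-12:
                break
            z = z - inner_step * g
        x_prev, x = x, z
        # Nesterov-style extrapolation (simplified momentum schedule).
        y = x + (k / (k + 3)) * (x - x_prev)
    return x
```

As a usage example, on a least-squares problem `f(x) = 0.5 * ||A x - b||^2` one would pass `f_grad = lambda x: A.T @ (A @ x - b)` and an inner step of roughly `1 / (L + kappa)`, where `L` is the largest eigenvalue of `A.T @ A`.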