Taylor, Adrien [UCL]
Hendrickx, Julien [UCL]
Glineur, François [UCL]
We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional, and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm attains this worst case. We achieve this by reducing the computation of the worst case to solving a convex semidefinite program, generalizing previous work on performance estimation by Drori and Teboulle [Math. Program., 145 (2014), pp. 451--482] and by the authors [A. B. Taylor, J. M. Hendrickx, and F. Glineur, Math. Program., 161 (2017), pp. 307--345]. We use these developments to obtain a tighter analysis of the proximal point algorithm and of several variants of fast proximal gradient, conditional gradient, subgradient, and alternating projection methods. In particular, we present a new analytical worst-case guarantee for the proximal point algorithm that is a factor of two better than the previously known one, and we improve the standard worst-case guarantee for the conditional gradient method by more than a factor of two. We also show how the optimized gradient method proposed by Kim and Fessler [Math. Program., 159 (2016), pp. 81--107] can be extended by incorporating a projection or a proximal operator, which leads to an algorithm that converges in the worst case twice as fast as the standard accelerated proximal gradient method [A. Beck and M. Teboulle, SIAM J. Imaging Sci., 2 (2009), pp. 183--202].
Bibliographic reference: Taylor, Adrien; Hendrickx, Julien; Glineur, François. Exact Worst-case Performance of First-order Methods for Composite Convex Optimization. In: SIAM Journal on Optimization, Vol. 27, no. 3, p. 1283-1313 (2017)
Permanent URL: http://hdl.handle.net/2078.1/180603