Nesterov, Yurii
[UCL]
In this paper we analyze several new methods for solving optimization problems whose objective function is formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but simple, with a known structure. Despite the bad properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the good part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (converging as O(1/k)), and an accelerated multistep version with convergence rate O(1/k^2), where k is the iteration counter. For all methods, we suggest efficient "line search" procedures and show that the additional computational work needed to estimate the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We also present the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.
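To make the method class concrete, here is a minimal sketch of a composite (proximal) gradient iteration with a backtracking "line search" on the unknown Lipschitz constant, plus an optional Nesterov-type multistep update; the choice of smooth part g(x) = ½‖Ax − b‖², the l1 simple term, the function names, and the backtracking factor eta = 2 are illustrative assumptions, not the paper's exact scheme.

```python
# Minimal sketch (not the paper's exact algorithm) of composite minimization
#   min_x  f(x) = g(x) + h(x),  g smooth convex, h simple convex,
# here with g(x) = 0.5*||Ax - b||^2 and h(x) = lam*||x||_1, so the
# "simple" step reduces to the soft-thresholding proximal operator.
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise shrinkage toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(A, b, lam, x0, max_iter=500, L0=1.0, eta=2.0,
                  accelerated=True):
    """Proximal gradient with backtracking on the unknown Lipschitz
    constant L of grad g; accelerated=True adds a Nesterov-type
    multistep update (O(1/k^2) rate), otherwise plain O(1/k)."""
    x = x0.copy()
    y = x0.copy()   # extrapolation point (equals x when not accelerated)
    t = 1.0         # momentum parameter
    L = L0          # current estimate of the Lipschitz constant
    for _ in range(max_iter):
        grad = A.T @ (A @ y - b)                     # gradient of g at y
        gy = 0.5 * np.linalg.norm(A @ y - b) ** 2    # g(y)
        # Backtracking: grow L until the quadratic upper model of g
        # at y majorizes g at the trial point x_new.
        while True:
            x_new = soft_threshold(y - grad / L, lam / L)
            d = x_new - y
            if (0.5 * np.linalg.norm(A @ x_new - b) ** 2
                    <= gy + grad @ d + 0.5 * L * (d @ d) + 1e-12):
                break
            L *= eta
        if accelerated:
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
    return x

# Tiny usage example on random data with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
x_hat = prox_gradient(A, b, lam=0.1, x0=np.zeros(100))
```

Because the backtracking loop only ever increases L by a constant factor, the total number of extra oracle calls across all iterations is bounded, which is the sense in which estimating the unknown class parameters multiplies the per-iteration cost by only a small constant.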
Bibliographic reference: Nesterov, Yurii. Gradient methods for minimizing composite objective function. CORE Discussion Papers; 2007/76 (2007).
Permanent URL: http://hdl.handle.net/2078.1/5122