Huang, Wen
[Department of Computational and Applied Mathematics, Rice University, Houston, TX, USA]
Absil, Pierre-Antoine
[UCL]
Gallivan, K. A.
[Department of Mathematics, Florida State University, Tallahassee, FL, USA]
In this paper, a Riemannian BFGS method for minimizing a smooth function on a Riemannian manifold is defined, based on a Riemannian generalization of a cautious update and a weak line search condition. It is proven that the Riemannian BFGS method converges (i) globally to stationary points without assuming the objective function to be convex and (ii) superlinearly to a nondegenerate minimizer. The weak line search condition makes it possible to avoid using the differentiated retraction entirely. The joint matrix diagonalization problem is chosen to demonstrate the performance of the algorithms with various parameters, line search conditions, and pairs of retractions and vector transports. A preliminary version can be found in [HAG16].
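For orientation, the following is a minimal sketch of the classical Euclidean analogues that the abstract's "cautious update" and line search conditions generalize, namely the Li–Fukushima cautious BFGS update and the Wolfe conditions. The symbols $B_k$, $s_k$, $y_k$, $\epsilon$, $\alpha$, $c_1$, $c_2$ are the usual Euclidean ones and are assumptions for illustration, not taken from the paper, whose Riemannian versions replace inner products by the Riemannian metric and differences of iterates and gradients by retractions and vector transports.
% Euclidean sketch (illustration only; the paper works in the Riemannian setting).
% Cautious update: the Hessian approximation B_k is updated only when the
% curvature test below holds; otherwise it is left unchanged.
\[
  B_{k+1} =
  \begin{cases}
    B_k - \dfrac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
        + \dfrac{y_k y_k^{\top}}{y_k^{\top} s_k},
      & \text{if } \dfrac{y_k^{\top} s_k}{\|s_k\|^2} \ge \epsilon\,\|\nabla f(x_k)\|^{\alpha}
        \quad (\epsilon > 0,\ \alpha > 0),\\[1.5ex]
    B_k, & \text{otherwise,}
  \end{cases}
\]
% where s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
% Classical Wolfe line search on the step size t_k along the direction d_k
% (with 0 < c_1 < c_2 < 1):
\[
  f(x_k + t_k d_k) \le f(x_k) + c_1 t_k \nabla f(x_k)^{\top} d_k,
  \qquad
  \nabla f(x_k + t_k d_k)^{\top} d_k \ge c_2\, \nabla f(x_k)^{\top} d_k .
\]
The point of the paper's weak line search condition is that its Riemannian counterpart of the curvature requirement can be checked without differentiating the retraction.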
Bibliographic reference: Huang, Wen; Absil, Pierre-Antoine; Gallivan, K. A. A Riemannian BFGS Method Without Differentiated Retraction for Nonconvex Optimization Problems. In: SIAM Journal on Optimization, Vol. 28, no. 1, p. 470-495 (2018).
Permanent URL: http://hdl.handle.net/2078.1/195965