Polyak's Heavy Ball (PHB) [1] and Nesterov's Accelerated Gradient (NAG) [2] are well-known examples of momentum methods for optimization. Although the latter outperforms the former, only generalizations of PHB-like methods to nonlinear spaces have been described in the literature. We propose here a generalization of NAG-like methods to Lie group optimization, based on the variational one-to-one correspondence between classical and accelerated momentum methods [3]. Numerical experiments and applications to AI posture recognition will be shown.
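For reference, a minimal sketch of the two classical Euclidean updates being generalized (the step size \(\alpha\) and momentum coefficient \(\beta\) are our notation, not taken from the abstract):
\[
\text{PHB:}\quad x_{k+1} = x_k - \alpha\,\nabla f(x_k) + \beta\,(x_k - x_{k-1}),
\qquad
\text{NAG:}\quad y_k = x_k + \beta\,(x_k - x_{k-1}),\quad x_{k+1} = y_k - \alpha\,\nabla f(y_k).
\]
The key difference is that NAG evaluates the gradient at the extrapolated point \(y_k\) rather than at \(x_k\); it is this accelerated variant whose Lie group analogue is proposed here.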
[1] Boris T. Polyak. Some methods of speeding up the convergence of iterative methods. Ž. Vyčisl. Mat. i Mat. Fiz., 4 (1964).
[2] Yu. E. Nesterov. A method for solving the convex programming problem with convergence rate \(O(1/k^2)\). Dokl. Akad. Nauk SSSR, 269 (1983).
[3] Cédric M. Campos, Alejandro Mahillo, and David Martín de Diego. A Discrete Variational Derivation of Accelerated Methods in Optimization. arXiv:2106.02700 [math.OC] (2021).