\NameCedar Site Bai \Emailbai123@purdue.edu
\addrDepartment of Computer Science
Purdue University
and \NameBrian Bullins \Emailbbullins@purdue.edu
\addrDepartment of Computer Science
Purdue University
Faster Acceleration for Steepest Descent
Abstract
Recent advances (Sherman2017Area; sidford2018coordinate; cohen2021relative) have overcome the fundamental barrier of dimension dependence in the iteration complexity of solving regression with first-order methods. Yet it remains unclear to what extent such acceleration can be achieved for general smooth functions. In this paper, we propose a new accelerated first-order method for convex optimization under non-Euclidean smoothness assumptions. In contrast to standard acceleration techniques, our approach uses primal-dual iterate sequences taken with respect to differing norms, which are then coupled using an implicitly determined interpolation parameter. For $\ell_p$-norm smooth problems in $d$ dimensions, our method provides an iteration complexity improvement of up to a factor polynomial in $d$ in terms of calls to a first-order oracle, thereby allowing us to circumvent long-standing barriers in accelerated non-Euclidean steepest descent.
keywords:
First-order acceleration, convex optimization, non-Euclidean smoothness, steepest descent

1 Introduction
Large-scale optimization tasks are a central part of modern machine learning, and many of the algorithms that find success in training these models, such as SGD