
Universal Online Convex Optimization With Minimax Optimal Second-Order Dynamic Regret

Abstract

We introduce an online convex optimization algorithm that uses projected subgradient descent with optimal adaptive learning rates. For a sequence of general convex functions, which need not be strongly convex, smooth, exp-concave, or even Lipschitz-continuous, we provide a second-order minimax-optimal dynamic regret guarantee, i.e., one dependent on the sum of squared subgradient norms. The guarantee holds against any comparator sequence with bounded path variation (the sum of distances between successive decisions). We derive a lower bound on the worst-case second-order dynamic regret that incorporates the actual subgradient norms, and show that this bound matches our regret guarantee up to a constant, making our algorithm minimax-optimal. We also derive an extension that learns in each coordinate block of the decision individually. We demonstrate how to best preserve our regret guarantee in a truly online manner when the bound on the comparator's path variation grows over time, or when feedback about it arrives only partially as time goes on. We then build on our algorithm to eliminate the need for any knowledge of the comparator path variation, and provide minimax-optimal second-order regret guarantees with no a priori information. Our approach can compete against all comparator sequences simultaneously (i.e., universally) in a minimax-optimal manner: each regret guarantee depends on the respective comparator's path variation. We discuss modifications that reduce time, computation, and memory complexity. We further improve our results by making the regret guarantees also depend on the minimum-bounding diameters of the comparator sets, in addition to the respective path variations.
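The abstract's core building block, projected subgradient descent with a second-order adaptive learning rate, can be sketched generically. The following is an illustrative sketch only, not the paper's exact algorithm: it runs online subgradient descent over a Euclidean ball with an AdaGrad-style step size proportional to the inverse square root of the accumulated squared subgradient norms (the "second-order" quantity the abstract's guarantee depends on). The function name, the ball domain, and the step-size constant are assumptions made for this example.

```python
import numpy as np

def projected_adaptive_ogd(subgrad, radius=1.0, dim=2, T=100):
    """Illustrative projected online subgradient descent.

    Uses an adaptive step size eta_t = radius / sqrt(sum of squared
    subgradient norms so far), then projects back onto the Euclidean
    ball of the given radius. A generic sketch, not the paper's method.
    """
    x = np.zeros(dim)          # initial decision
    sq_norm_sum = 0.0          # running sum of ||g_t||^2
    decisions = []
    for t in range(T):
        decisions.append(x.copy())
        g = subgrad(t, x)                      # subgradient of f_t at x
        sq_norm_sum += float(np.dot(g, g))
        eta = radius / np.sqrt(sq_norm_sum) if sq_norm_sum > 0 else 0.0
        x = x - eta * g                        # subgradient step
        norm = np.linalg.norm(x)
        if norm > radius:                      # project onto the ball
            x = x * (radius / norm)
    return decisions
```

With identical quadratic losses the iterates drift toward the minimizer, while the accumulated squared norms keep the step sizes stable; against time-varying losses the same recursion yields the dynamic-regret behavior the abstract analyzes.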

Article
2024
English
Featured Keywords

Heuristic algorithms
Optimization
Convex functions
Vectors
Programming
Complexity theory
Upper bound
Convex optimization
dynamic regret
gradient descent
minimax optimal
online learning
universal guarantee
