Handbook of Convergence Theorems for (Stochastic) Gradient Methods

Bibliographic Details
Title: Handbook of Convergence Theorems for (Stochastic) Gradient Methods
Authors: Garrigos, Guillaume, Gower, Robert M.
Publication Year: 2023
Collection: Mathematics
Subject Terms: Mathematics - Optimization and Control, 65K05, 68T99, G.1.6
More Details: This is a handbook of simple proofs of the convergence of gradient and stochastic gradient descent type methods. We consider functions that are Lipschitz, smooth, convex, strongly convex, and/or Polyak-Łojasiewicz. Our focus is on "good proofs" that are also simple, and each section can be consulted separately. We start with proofs for gradient descent, then move on to stochastic variants, including mini-batching and momentum, and then to nonsmooth problems with the subgradient method, proximal gradient descent, and their stochastic variants. Throughout, we focus on global convergence rates and complexity rates. Some slightly less common proofs found here include those for SGD (stochastic gradient descent) with a proximal step, with momentum, and with mini-batching without replacement.
Comment: From v2 to v3: Added new sections about SPP (Stochastic Proximal Point) and SPS (Stochastic Polyak Stepsize). Added a proof for SGD on nonconvex functions. Simplified some statements for SGD. Corrected various errors and misprints.
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2301.11235
Accession Number: edsarx.2301.11235
Database: arXiv
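The abstract above mentions convergence proofs for SGD with momentum and with mini-batching without replacement. As a rough illustration of the kind of iteration the handbook analyzes (this sketch is not from the paper; the function and parameter names below, such as grad_i, lr, and beta, are illustrative assumptions), here is a minimal SGD loop with a heavy-ball momentum buffer and mini-batches drawn without replacement by shuffling once per epoch:

```python
import numpy as np

def sgd_momentum_without_replacement(grad_i, x0, n, lr=0.1, beta=0.9,
                                      batch_size=8, epochs=20, rng=None):
    """Minimal SGD sketch (illustrative only, not the handbook's code).

    grad_i(x, idx) is assumed to return the average gradient of the losses
    indexed by `idx` at the point x. Mini-batches are sampled without
    replacement by permuting the n data indices at the start of each epoch.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)                               # momentum buffer
    for _ in range(epochs):
        perm = rng.permutation(n)                      # one pass, no replacement
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]       # current mini-batch
            g = grad_i(x, idx)                         # stochastic gradient estimate
            m = beta * m + g                           # heavy-ball momentum update
            x = x - lr * m                             # parameter update
    return x

# Toy usage on a least-squares problem: f(x) = (1/2n) * ||A x - b||^2
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 5
    A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
    grad = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
    x_hat = sgd_momentum_without_replacement(grad, np.zeros(d), n, lr=0.05)
    print(np.linalg.norm(A @ x_hat - b))
```

The handbook's results concern the convergence rates of iterations of this general shape under assumptions such as smoothness, convexity, strong convexity, or the Polyak-Łojasiewicz condition; see the paper at the access URL above for the precise statements and step-size choices.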