Randomized Forward Mode of Automatic Differentiation For Optimization Algorithms

Bibliographic Details
Title: Randomized Forward Mode of Automatic Differentiation For Optimization Algorithms
Authors: Shukla, Khemraj; Shin, Yeonjong
Publication Year: 2023
Collection: Computer Science; Mathematics
Subject Terms: Mathematics - Optimization and Control, Computer Science - Artificial Intelligence, Computer Science - Machine Learning, 65K05, 65B99, 65Y20
More Details: We present a randomized forward-mode gradient (RFG) as an alternative to backpropagation. RFG is a random estimator of the gradient constructed from the directional derivative along a random vector, and forward-mode automatic differentiation (AD) computes it efficiently. The probability distribution of the random vector determines the statistical properties of RFG. Through a second-moment analysis, we find that the distribution with the smallest kurtosis yields the smallest expected relative squared error. Replacing the gradient with RFG yields a class of RFG-based optimization algorithms. Focusing on gradient descent (GD) and Polyak's heavy ball (PHB) methods, we present a convergence analysis of RFG-based optimization algorithms for quadratic functions. Computational experiments demonstrate the performance of the proposed algorithms and verify the theoretical findings.
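The abstract describes the estimator concretely enough to sketch it: sample a random direction v, use one forward-mode AD pass to get the directional derivative of f at x along v, and scale v by it. Below is a minimal sketch in JAX, assuming a Rademacher sampling distribution and using jax.jvp for the forward-mode pass; the function names rfg, rfg_gd, and rfg_phb and all step-size/momentum values are illustrative choices, not taken from the paper.

```python
import jax
import jax.numpy as jnp

def rfg(f, x, key):
    """Randomized forward-mode gradient: (grad f(x) . v) * v for a random v."""
    # Rademacher direction (entries +/-1). The paper relates the estimator's
    # error to the distribution's kurtosis, so this is one choice, not *the* method.
    v = jnp.where(jax.random.bernoulli(key, 0.5, x.shape), 1.0, -1.0)
    # One forward-mode AD pass (JVP) computes the directional derivative f'(x; v).
    _, dirderiv = jax.jvp(f, (x,), (v,))
    return dirderiv * v

def rfg_gd(f, x0, lr=1e-2, steps=200, seed=0):
    """Gradient descent with the true gradient replaced by RFG."""
    x, key = x0, jax.random.PRNGKey(seed)
    for _ in range(steps):
        key, sub = jax.random.split(key)
        x = x - lr * rfg(f, x, sub)
    return x

def rfg_phb(f, x0, lr=1e-2, beta=0.9, steps=200, seed=0):
    """Polyak's heavy ball with the gradient replaced by RFG."""
    x_prev, x = x0, x0
    key = jax.random.PRNGKey(seed)
    for _ in range(steps):
        key, sub = jax.random.split(key)
        # x_{k+1} = x_k - lr * g_k + beta * (x_k - x_{k-1}), with g_k = RFG.
        x, x_prev = x - lr * rfg(f, x, sub) + beta * (x - x_prev), x
    return x

# Quadratic test function, matching the setting of the paper's convergence analysis.
f = lambda x: 0.5 * jnp.dot(x, x)
print(rfg_gd(f, jnp.ones(8)))  # iterates contract toward the minimizer at 0
```

Rademacher entries attain the smallest possible kurtosis, which is consistent with the abstract's finding that the lowest-kurtosis distribution minimizes the expected relative squared error; the paper's exact recommended distribution should be checked against the full text.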
Comment: 22 Pages, 7 Figures
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2310.14168
Accession Number: edsarx.2310.14168
Database: arXiv