Simplified derivations for high-dimensional convex learning problems

Bibliographic Details
Title: Simplified derivations for high-dimensional convex learning problems
Authors: Clark, David G., Sompolinsky, Haim
Publication Year: 2024
Collection: Computer Science; Condensed Matter; Quantitative Biology
Subject Terms: Condensed Matter - Disordered Systems and Neural Networks, Computer Science - Neural and Evolutionary Computing, Quantitative Biology - Neurons and Cognition
More Details: Statistical-physics calculations in machine learning and theoretical neuroscience often involve lengthy derivations that obscure physical interpretation. We present concise, non-replica derivations of key results and highlight their underlying similarities. Using a cavity approach, we analyze high-dimensional learning problems: perceptron classification of points and manifolds, and kernel ridge regression. These problems share a common structure, a bipartite system of interacting feature and datum variables, which enables a unified analysis. For perceptron-capacity problems, we identify a symmetry that allows derivation of correct capacities through a naïve method.
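The following is not part of the record; it is a minimal numerical sketch of the perceptron-capacity setting the abstract refers to, assuming the standard zero-margin setup with Gaussian inputs and random ±1 labels, where separability of P = αN random patterns transitions near the classic capacity α ≈ 2 at large N. The function name `is_separable` and all parameters are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def is_separable(X, y):
    # Linear-program feasibility check: does a weight vector w exist with
    # y_mu * (w . x_mu) >= 1 for every pattern mu?  (Zero-margin capacity;
    # any strictly separating w can be rescaled to satisfy this.)
    P, N = X.shape
    A_ub = -(y[:, None] * X)          # encodes y_mu * (w . x_mu) >= 1
    b_ub = -np.ones(P)
    res = linprog(c=np.zeros(N), A_ub=A_ub, b_ub=b_ub,
                  bounds=(None, None), method="highs")
    return res.status == 0            # status 0 = a feasible w was found

# Estimate the fraction of separable random dichotomies vs. load alpha = P/N.
rng = np.random.default_rng(0)
N, trials = 60, 20
for alpha in (1.0, 1.5, 2.0, 2.5, 3.0):
    P = int(alpha * N)
    frac = np.mean([
        is_separable(rng.standard_normal((P, N)), rng.choice([-1.0, 1.0], P))
        for _ in range(trials)
    ])
    print(f"alpha = {alpha:.1f}: separable fraction ~ {frac:.2f}")
```

At this moderate N the transition is smeared, but the separable fraction drops from near 1 below α = 2 to near 0 above it, which is the capacity that the statistical-physics derivations in the paper compute analytically.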
Comment: Submission to SciPost; 28 pages, 1 figure; fixed typos, added references
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2412.01110
Accession Number: edsarx.2412.01110
Database: arXiv