Layer Dynamics of Linearised Neural Nets

Bibliographic Details
Title: Layer Dynamics of Linearised Neural Nets
Authors: Basu, Saurav; Mukherjee, Koyel; Vasudevan, Shrihari
Publication Year: 2019
Collection: Computer Science, Statistics
Subject Terms: Computer Science - Machine Learning, Statistics - Machine Learning
More Details: Despite the phenomenal success of deep learning in recent years, there remains a gap in understanding the fundamental mechanics of neural nets. Much research focuses on handcrafting complex and larger networks, and the design decisions are often ad hoc and based on intuition. Some recent research has aimed to demystify the learning dynamics in neural nets by building a theory from first principles, for example by characterising the non-linear dynamics of specialised linear deep neural nets (such as orthogonal networks). In this work, we expand on and derive properties of the learning dynamics obeyed by general multi-layer linear neural nets. Although they are merely an over-parameterisation of a single-layer linear network, linear multi-layer neural nets offer interesting insights that explain how learning dynamics proceed in small pockets of the data space. In particular, we show that the multiple layers of a linear net grow at approximately the same rate, and that there are distinct phases of learning with markedly different layer growth. We then apply a linearisation process to a general ReLU neural net and show how the nonlinearity breaks down the growth symmetry observed in linear neural nets. Overall, our work can be viewed as an initial step towards a first-principles theory of the effect of layer design on learning dynamics.
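The abstract's central observation, that the layers of a deep linear net grow at approximately the same rate under gradient descent, can be illustrated with a minimal sketch. The two-layer setup, dimensions, learning rate, and small random initialisation below are illustrative assumptions, not the authors' experimental configuration; the sketch simply trains Y ≈ W2 W1 X by gradient descent and compares the two layers' Frobenius norms afterwards.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's experiments):
# fit a random linear map with a 2-layer linear net and observe that
# the two weight matrices grow at roughly the same rate.
rng = np.random.default_rng(0)
d, n = 5, 200
A = rng.normal(size=(d, d))           # target linear map
X = rng.normal(size=(d, n))           # inputs
Y = A @ X                             # targets

# Small near-zero initialisation (a common choice when studying
# plateau phases in linear-net dynamics).
W1 = 0.01 * rng.normal(size=(d, d))
W2 = 0.01 * rng.normal(size=(d, d))

def loss(W1, W2):
    E = W2 @ W1 @ X - Y
    return 0.5 * np.sum(E ** 2) / n

loss_init = loss(W1, W2)
lr = 0.01
for _ in range(5000):
    E = (W2 @ W1 @ X - Y) / n         # sample-averaged residual
    M = E @ X.T
    g1 = W2.T @ M                     # dL/dW1
    g2 = M @ W1.T                     # dL/dW2
    W1 -= lr * g1
    W2 -= lr * g2

loss_final = loss(W1, W2)
n1, n2 = np.linalg.norm(W1), np.linalg.norm(W2)
# Under gradient flow, W1 W1^T - W2^T W2 is conserved, so layers that
# start (approximately) balanced stay approximately balanced as they
# grow: n1 and n2 track each other throughout training.
```

The conservation of W1 W1^T − W2^T W2 follows by substituting the gradient-flow updates into d(W1 W1^T)/dt and d(W2^T W2)/dt, which yield the same expression; discrete gradient descent with a small step size preserves it approximately.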
Document Type: Working Paper
Access URL: http://arxiv.org/abs/1904.10689
Accession Number: edsarx.1904.10689
Database: arXiv