On the Analysis of Trajectories of Gradient Descent in the Optimization of Deep Neural Networks

Balasubramanian, Vineeth N and Adepu, R S and Srinivasan, Vishwak (2018) On the Analysis of Trajectories of Gradient Descent in the Optimization of Deep Neural Networks. In: Workshop on Theory of Deep Learning and Workshop on Non-Convex Optimization at the International Conference on Machine Learning (ICML), July 2018, Stockholm, Sweden.

Full text not available from this repository.


Theoretical analysis of the error landscape of deep neural networks has garnered significant interest in recent years. In this work, we theoretically study the importance of noise in the trajectories of gradient descent towards optimal solutions in multi-layer neural networks. We show that adding noise (in different ways) to a neural network during training increases the rank of the product of the weight matrices of a multi-layer linear neural network. We then study how adding noise can assist in reaching a global optimum when the product matrix is full-rank (under certain conditions). We establish a theoretical connection between the noise injected into the neural network, whether into the gradient, the architecture, or the input/output, and the rank of the product of its weight matrices. We corroborate our theoretical findings with empirical results.
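The rank phenomenon described in the abstract can be illustrated numerically. The sketch below (an assumption-laden toy example, not the paper's actual construction) builds a three-layer linear network whose middle weight matrix is rank-deficient, so the end-to-end product inherits that rank bottleneck; perturbing each weight matrix with small Gaussian noise makes every factor full rank almost surely, and hence the product as well:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-layer linear network y = W3 @ W2 @ W1 @ x with a rank-deficient
# middle layer: the product of weight matrices inherits the bottleneck.
W1 = rng.standard_normal((8, 8))
W2 = np.zeros((8, 8))
W2[:2, :2] = rng.standard_normal((2, 2))  # rank of W2 is at most 2
W3 = rng.standard_normal((8, 8))

product = W3 @ W2 @ W1
print(np.linalg.matrix_rank(product))  # at most 2

# Adding small Gaussian noise to each weight matrix makes every factor
# full rank almost surely, so the product becomes full rank as well.
sigma = 1e-3
noisy_product = (
    (W3 + sigma * rng.standard_normal(W3.shape))
    @ (W2 + sigma * rng.standard_normal(W2.shape))
    @ (W1 + sigma * rng.standard_normal(W1.shape))
)
print(np.linalg.matrix_rank(noisy_product))  # 8, full rank
```

The noise level `sigma` here is arbitrary; the point is only that a generic perturbation of a square matrix is invertible with probability 1, so the noisy product escapes the low-rank set.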

IITH Creators: Balasubramanian, Vineeth N
ORCiD: UNSPECIFIED
Item Type: Conference or Workshop Item (Paper)
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: Library Staff
Date Deposited: 28 Oct 2019 11:22
Last Modified: 28 Oct 2019 11:23
URI: http://raiith.iith.ac.in/id/eprint/6800
