Learning to backpropagate
Date
2020
Authors
Eyono, Roy Henha
Abstract
The backpropagation algorithm is regarded as the de facto standard for
gradient optimization in artificial neural networks. Since the method's
conception, modifications of the algorithm have been proposed. Adaptive
learning rate methods are examples of custom modifications to the
backpropagation equation, as they scale the gradient update during training.
Existing gradient modifications are largely based on the theoretical
fundamentals of gradient optimization, but few methods derive their
modifications from data. We are motivated by the idea of discovering better
custom gradient update equations for gradient optimization.
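To make the contrast concrete (a sketch in our own notation, not an equation taken from the dissertation): vanilla stochastic gradient descent and an adaptive method such as RMSProp differ only in how the raw gradient is transformed before the parameter update is applied:

    \theta_{t+1} = \theta_t - \eta \, \nabla L(\theta_t)   (vanilla SGD)

    \theta_{t+1} = \theta_t - \eta \, \frac{\nabla L(\theta_t)}{\sqrt{v_t} + \epsilon}, \qquad v_t = \beta \, v_{t-1} + (1 - \beta) \, \nabla L(\theta_t)^2   (RMSProp-style adaptive scaling)

Hand-designed methods fix the form of this transformation in advance; a data-driven approach instead searches over such forms.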
In this dissertation, we present our parametrized backpropagation learning
framework (PBLF), which learns modifications of the backpropagation gradient
update equation for stochastic gradient optimization. We achieve this by
optimizing parts of the backpropagation equation to produce custom gradient
update equations. We evaluate each custom equation by training our target
network and measuring its performance on a validation dataset. We provide
empirical analysis and evidence to support PBLF as a competitive alternative
to standard backpropagation.
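The abstract does not give PBLF's exact parametrization, so the following is only a minimal sketch of the general recipe it describes: a gradient update equation with learnable coefficients, scored by the trained target network's performance on validation data. The update form, the coefficient vector phi, and the toy loss below are illustrative assumptions, not PBLF's actual equations:

    import numpy as np

    def parametrized_update(theta, grad, phi, lr=0.01):
        """Apply a custom gradient update equation.

        phi = (a, b) are learnable coefficients that reshape the raw
        gradient; phi = (1.0, 0.0) recovers plain SGD. This form is a
        hypothetical illustration, not the equation used in PBLF.
        """
        a, b = phi
        return theta - lr * (a * grad + b * np.sign(grad) * grad ** 2)

    def validation_loss(theta):
        # Placeholder quadratic loss standing in for the target
        # network's loss on a held-out validation set.
        return float(np.sum(theta ** 2))

    # Meta-search over phi: train with each candidate update equation,
    # then score the resulting parameters on the validation objective.
    rng = np.random.default_rng(0)
    best_phi, best_score = None, float("inf")
    for _ in range(20):
        phi = rng.normal(size=2)               # candidate update equation
        theta = rng.normal(size=5)             # fresh "network" weights
        for _ in range(50):                    # inner training loop
            grad = 2 * theta                   # gradient of the toy loss
            theta = parametrized_update(theta, grad, phi)
            theta = np.clip(theta, -1e6, 1e6)  # keep the toy run bounded
        score = validation_loss(theta)
        if score < best_score:
            best_phi, best_score = phi, score

    print("best phi:", best_phi, "validation loss:", best_score)

A random search over phi stands in for the dissertation's optimization of the update equation; the structural point is that the equation itself, not just the learning rate, is the object being learned.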
In our experiments, we report competitive empirical performance on CIFAR10
with custom gradient update equations sampled from PBLF. Our data-driven
method offers promising custom update equations for gradient optimization.
Description
A dissertation submitted in fulfillment of the requirements for the degree of
Master of Science to the Faculty of Science, University of the
Witwatersrand, Johannesburg, 2020
Citation
Eyono Henha, Roy Pavel Samuel (2020) Learning to backpropagate, University of the Witwatersrand, Johannesburg, <http://hdl.handle.net/10539/31427>