Direct Feedback Alignment With Sparse Connections for Local Learning

Recent advances in deep neural networks (DNNs) owe their success to training algorithms that use backpropagation and gradient descent. Backpropagation, while highly effective on von Neumann architectures, becomes inefficient when scaling to large networks. Commonly referred to as the weight transport...
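The abstract refers to Direct Feedback Alignment (DFA), in which hidden layers receive the output error through fixed random feedback matrices rather than the transpose of the forward weights, avoiding the weight transport problem. The sketch below is an illustrative toy example only, not the paper's implementation: a two-layer network updated with DFA, where the feedback matrix `B` is made sparse by randomly zeroing a fraction of its entries (the sparsity level and all dimensions are arbitrary choices here).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network: x -> h -> y_hat (dimensions chosen arbitrarily)
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))

# Fixed random feedback matrix with sparse connections:
# each hidden unit receives error from only a subset of output units.
B = rng.normal(0, 0.1, (n_hid, n_out))
B *= rng.random((n_hid, n_out)) < 0.5  # zero out ~half the feedback weights

def relu(z):
    return np.maximum(z, 0.0)

def dfa_step(x, y, lr=0.05):
    """One DFA update: forward pass, then local weight updates driven
    by the output error projected through the fixed sparse matrix B."""
    global W1, W2
    a1 = W1 @ x
    h = relu(a1)
    y_hat = W2 @ h                # linear output layer
    e = y_hat - y                 # output error (squared-error gradient)
    # Output layer uses its true local gradient.
    W2 -= lr * np.outer(e, h)
    # Hidden layer: error arrives via B, not W2.T (no weight transport).
    delta1 = (B @ e) * (a1 > 0)   # gated by the ReLU derivative
    W1 -= lr * np.outer(delta1, x)
    return 0.5 * float(e @ e)

# Train on one fixed input/target pair; the loss should fall.
x = rng.normal(size=n_in)
y = rng.normal(size=n_out)
losses = [dfa_step(x, y) for _ in range(200)]
print(losses[0] > losses[-1])
```

The key point the sketch illustrates is locality: each layer's update depends only on its own activations and an error signal delivered over fixed, sparse feedback wiring, which is what makes the scheme attractive for non-von Neumann hardware.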


Bibliographic Details
Main Authors: Brian Crafton, Abhinav Parihar, Evan Gebhardt, Arijit Raychowdhury
Format: Article
Language: English
Published: Frontiers Media S.A. 2019-05-01
Series: Frontiers in Neuroscience
Online Access: https://www.frontiersin.org/article/10.3389/fnins.2019.00525/full