Synaptic metaplasticity in binarized neural networks

Deep neural networks typically forget previously learned tasks rapidly when trained on new ones. Laborieux et al. propose a method for training binarized neural networks, inspired by neuronal metaplasticity, that mitigates this catastrophic forgetting and is relevant for neuromorphic applications.
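As a rough illustration of the idea, the sketch below applies a metaplasticity-style update to the hidden (real-valued) weights of a binarized network: updates that would push a hidden weight toward zero, and hence toward flipping its binary sign, are attenuated the larger the hidden weight's magnitude is. The attenuation function `1 - tanh²(m·|w|)` and all parameter names here are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def metaplastic_update(w_hidden, grad, lr=0.01, m=1.0):
    """One metaplasticity-style SGD step on a BNN's hidden weights (sketch).

    The forward pass uses sign(w_hidden). A plain step -lr*grad that shrinks
    |w_hidden| (i.e. grad and w_hidden have the same sign) is scaled by
    1 - tanh^2(m*|w_hidden|), so well-consolidated weights resist sign flips,
    while steps that grow |w_hidden| are applied in full.
    """
    shrinks = (grad * w_hidden) > 0  # step would move w_hidden toward zero
    f_meta = np.where(shrinks, 1.0 - np.tanh(m * np.abs(w_hidden)) ** 2, 1.0)
    return w_hidden - lr * f_meta * grad

# Hidden weights of very different consolidation levels, same gradient.
w_hidden = np.array([2.0, 0.1])
grad = np.array([1.0, 1.0])
w_new = metaplastic_update(w_hidden, grad)
```

In this toy example the strongly consolidated weight (2.0) barely moves, while the weakly consolidated one (0.1) takes nearly the full step.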

Bibliographic Details
Main Authors: Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin, Damien Querlioz
Format: Article
Language: English
Published: Nature Publishing Group, 2021-05-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-021-22768-y