Meta-consolidation for continual learning

Joseph, K. J. and Balasubramanian, Vineeth N. (2020) Meta-consolidation for continual learning. Advances in Neural Information Processing Systems. ISSN 1049-5258

Full text not available from this repository.


The ability to continuously learn and adapt to new tasks without losing grasp of previously acquired knowledge is a hallmark of biological learning systems, one that current deep learning systems fall short of. In this work, we present a novel methodology for continual learning called MERLIN: Meta-Consolidation for Continual Learning. We assume that the weights θ of a neural network for solving task t come from a meta-distribution p(θ|t). This meta-distribution is learned and consolidated incrementally. We operate in the challenging online continual learning setting, where a data point is seen by the model only once. Our experiments on the continual learning benchmarks MNIST, CIFAR-10, CIFAR-100 and Mini-ImageNet show consistent improvement over five baselines, including a recent state-of-the-art method, corroborating the promise of MERLIN.
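To make the core idea concrete, the following is a minimal toy sketch of maintaining a task-conditioned distribution over network weights p(θ|t), consolidating it per task, and sampling weight vectors at inference. This is an illustrative assumption, not the paper's actual method: MERLIN learns its meta-distribution with a latent-variable model, whereas the sketch below substitutes a simple diagonal Gaussian per task, and all names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # hypothetical size of the flattened weight vector


class WeightMetaDistribution:
    """Toy stand-in for p(theta | t): one diagonal Gaussian per task."""

    def __init__(self):
        self.tasks = {}  # task id -> (mean, std) over weight dimensions

    def consolidate(self, task_id, weight_samples):
        """Fit p(theta | t) from weight vectors obtained on task t."""
        w = np.asarray(weight_samples)
        # Small floor on std keeps sampling well-defined for constant dims.
        self.tasks[task_id] = (w.mean(axis=0), w.std(axis=0) + 1e-6)

    def sample(self, task_id, n=1):
        """Draw n candidate weight vectors for task t."""
        mean, std = self.tasks[task_id]
        return rng.normal(mean, std, size=(n, mean.shape[0]))


meta = WeightMetaDistribution()
for t in range(3):  # three sequential tasks
    # Stand-in for weight vectors produced by training on task t.
    trained = rng.normal(loc=t, scale=0.1, size=(5, DIM))
    meta.consolidate(t, trained)

# At inference, sample an ensemble of weight vectors for a given task.
ensemble = meta.sample(task_id=1, n=4)
print(ensemble.shape)  # (4, 8)
```

The point of the sketch is only the interface: weights are treated as samples from a learned, task-conditioned distribution that is updated as tasks arrive, rather than as a single fixed parameter vector.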

IITH Creators:
IITH CreatorsORCiD
Balasubramanian, Vineeth NUNSPECIFIED
Item Type: Article
Uncontrolled Keywords: Biological learning; Continual learning; Data points; Novel methodology; Recent state; Learning systems; Deep learning; Image enhancement
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: LibTrainee 2021
Date Deposited: 31 Jul 2021 11:52
Last Modified: 31 Jul 2021 11:52

RAIITH ePrint: 8616