BI-MAML: Balanced Incremental Approach for Meta Learning

Bibliographic Details
Title: BI-MAML: Balanced Incremental Approach for Meta Learning
Authors: Zheng, Yang; Xiang, Jinlin; Su, Kun; Shlizerman, Eli
Publication Year: 2020
Collection: Computer Science; Quantitative Biology; Statistics
Subject Terms: Computer Science - Machine Learning, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Robotics, Quantitative Biology - Neurons and Cognition, Statistics - Machine Learning
More Details: We present a novel Balanced Incremental Model Agnostic Meta Learning system (BI-MAML) for learning multiple tasks. Our method implements a meta-update rule to incrementally adapt its model to new tasks without forgetting old ones. Such a capability is not possible in current state-of-the-art MAML approaches: these methods adapt effectively to new tasks but suffer from the 'catastrophic forgetting' phenomenon, in which new tasks streamed into the model degrade its performance on previously learned tasks. Our system performs these meta-updates successfully with only a few shots. Our key idea for achieving this is the design of a balanced learning strategy for the baseline model. The strategy sets the baseline model to perform equally well across tasks and incorporates time efficiency. The balanced learning strategy enables BI-MAML both to outperform other state-of-the-art models in classification accuracy on existing tasks and to adapt efficiently to similar new tasks with fewer required shots. We evaluate BI-MAML through comparisons on two common benchmark datasets with multiple image classification tasks. BI-MAML demonstrates advantages in both accuracy and efficiency.
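The abstract above builds on the generic MAML inner/outer update that BI-MAML extends. The record does not specify BI-MAML's actual balanced, incremental update, so the following is only a minimal sketch of the standard first-order MAML meta-update on toy 1-D regression tasks; all function names, hyperparameters, and the regression setup are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """Gradient of mean squared error for the linear model y_hat = w * x."""
    return 2.0 * np.mean((w * x - y) * x)

def maml_meta_step(w, tasks, inner_lr=0.05, outer_lr=0.1):
    """One meta-update: adapt to each task with a single inner gradient
    step on its support set, then move the shared initialization toward
    parameters that perform well after adaptation (first-order MAML).
    BI-MAML's balanced/incremental variant is NOT reproduced here."""
    meta_grad = 0.0
    for x_support, y_support, x_query, y_query in tasks:
        # Inner loop: few-shot adaptation on the support set.
        w_adapted = w - inner_lr * loss_grad(w, x_support, y_support)
        # Outer gradient evaluated on the held-out query set.
        meta_grad += loss_grad(w_adapted, x_query, y_query)
    return w - outer_lr * meta_grad / len(tasks)

def make_task(slope, n=10):
    """Sample a regression task y = slope * x with a support/query split."""
    x = rng.uniform(-1.0, 1.0, size=2 * n)
    y = slope * x
    return x[:n], y[:n], x[n:], y[n:]

# Tasks whose true slopes cluster around 2.0; the meta-learned
# initialization should drift toward that cluster, so a single
# few-shot inner step suffices to fit any new task from it.
w = 0.0
for _ in range(200):
    tasks = [make_task(rng.normal(2.0, 0.1)) for _ in range(4)]
    w = maml_meta_step(w, tasks)
```

In this toy setting the initialization `w` converges near the task-family mean slope, which is the behavior the 'few shots' claim in the abstract relies on; BI-MAML's contribution is additionally keeping such an initialization balanced across old and newly streamed tasks.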
Comment: Please see associated video at: https://youtu.be/4qlb-iG5SFo
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2006.07412
Accession Number: edsarx.2006.07412
Database: arXiv