Learning to Forget for Meta-Learning
Cited 50 times in Web of Science
Cited 63 times in Scopus
- Authors
- Issue Date
- 2020-01
- Publisher
- Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
- Citation
- Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp.2376-2384
- Abstract
- © 2020 IEEE. Few-shot learning is a challenging problem in which the goal is to generalize from only a few examples. Model-agnostic meta-learning (MAML) tackles the problem by formulating prior knowledge as a common initialization across tasks, which is then used to quickly adapt to unseen tasks. However, forcibly sharing an initialization can lead to conflicts among tasks and a compromised (undesired by individual tasks) location on the optimization landscape, thereby hindering task adaptation. Further, we observe that the degree of conflict differs not only among tasks but also among the layers of a neural network. Thus, we propose task-and-layer-wise attenuation of the compromised initialization to reduce its influence. As the attenuation dynamically controls (or selectively forgets) the influence of prior knowledge for each given task and layer, we name our method L2F (Learn to Forget). Experimental results demonstrate that the proposed method provides faster adaptation and greatly improves performance. Furthermore, L2F can easily be applied to and improve other state-of-the-art MAML-based frameworks, illustrating its simplicity and generalizability.
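The core idea in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the model is a toy two-layer linear network, and the layer-wise attenuation factor `gamma` is a hand-crafted sigmoid of the gradient magnitude, standing in for the learned attenuator network that the paper trains. All names (`theta0`, `adapt`, `attenuation`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a two-layer linear model whose meta-learned
# initialization theta0 is shared across tasks (the MAML prior).
theta0 = [rng.normal(size=(4, 4)), rng.normal(size=(4, 1))]

def loss_and_grads(theta, x, y):
    # Forward pass: y_hat = (x @ W1) @ W2, with squared-error loss.
    h = x @ theta[0]
    y_hat = h @ theta[1]
    err = y_hat - y
    loss = float(np.mean(err ** 2))
    # Manual gradients for the two linear layers.
    g2 = h.T @ (2 * err) / len(x)
    g1 = x.T @ ((2 * err) @ theta[1].T) / len(x)
    return loss, [g1, g2]

def attenuation(grads, w=1.0):
    # Layer-wise gamma in (0, 1) from gradient statistics -- a hand-crafted
    # stand-in for the learned attenuator network described in the paper.
    return [1.0 / (1.0 + np.exp(-w * float(np.mean(np.abs(g))))) for g in grads]

def adapt(theta0, x, y, inner_lr=0.001):
    # Task-and-layer-wise "forgetting": scale each layer of the shared
    # initialization by its own gamma before inner-loop adaptation.
    _, grads = loss_and_grads(theta0, x, y)
    gamma = attenuation(grads)
    theta = [g_l * p for g_l, p in zip(gamma, theta0)]
    loss_before, grads = loss_and_grads(theta, x, y)
    # One inner-loop gradient step from the attenuated initialization.
    theta_task = [p - inner_lr * g for p, g in zip(theta, grads)]
    return theta_task, loss_before

# A synthetic task: adapt from the attenuated prior and check the loss moves.
x = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 1))
theta_task, loss_before = adapt(theta0, x, y)
loss_after, _ = loss_and_grads(theta_task, x, y)
```

In the paper the attenuator is itself meta-learned jointly with the initialization, so `gamma` depends on the task's gradients through a trained network rather than the fixed sigmoid used here.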
- ISSN
- 1063-6919
- Files in This Item:
- There are no files associated with this item.