Research Area:  Machine Learning
This paper addresses the problem of active task selection, which involves selecting the most informative tasks for meta-learning. We propose a novel active task selection criterion based on the mutual information between latent task vectors. Unfortunately, optimizing such a criterion scales poorly in the number of candidate tasks. To resolve this issue, we exploit the submodularity of our new criterion to devise the first active task selection algorithm for meta-learning with a near-optimal performance guarantee. To further improve efficiency, we propose an online variant of Stein variational gradient descent that performs fast belief updates of the meta-parameters by maintaining a set of forward (and backward) particles when learning (and unlearning, respectively) from each selected task. We empirically demonstrate the performance of our proposed algorithm on real-world datasets.
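The abstract's belief-update machinery builds on Stein variational gradient descent (SVGD), which transports a set of particles toward a target posterior using a kernelized gradient flow. Below is a minimal, self-contained sketch of the standard (offline) SVGD update on a 1D standard normal target, not the paper's online forward/backward variant; the bandwidth, step size, and particle count are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(x, h=1.0):
    """Pairwise RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = x[:, None] - x[None, :]            # diff[i, j] = x_i - x_j, shape (n, n)
    k = np.exp(-diff**2 / (2.0 * h**2))       # k[i, j] = k(x_i, x_j)
    grad_k = -diff / h**2 * k                 # grad_k[i, j] = d k(x_i, x_j) / d x_i
    return k, grad_k

def svgd_step(x, score, step=0.1, h=1.0):
    """One SVGD update:
    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ].
    """
    n = len(x)
    k, grad_k = rbf_kernel(x, h)
    # k is symmetric, and d/dx_j k(x_j, x_i) = -grad_k[i, j].
    phi = (k @ score(x) - grad_k.sum(axis=1)) / n
    return x + step * phi

# Target: standard normal, whose score is d/dx log p(x) = -x.
score = lambda x: -x

rng = np.random.default_rng(0)
x = rng.normal(5.0, 1.0, size=50)   # particles initialized far from the target
for _ in range(1000):
    x = svgd_step(x, score)

# Particles drift toward the target mean 0 while the kernel-gradient
# (repulsive) term keeps them spread out to cover the posterior.
```

The attractive term `k @ score(x)` pulls particles toward high-density regions, while the `grad_k` term acts as a repulsive force preventing particle collapse; the paper's online variant additionally maintains backward particles so that the contribution of a task can be removed (unlearned) without retraining from scratch.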
Keywords:  
Bayesian unlearning
Meta-learning
Task selection
Artificial intelligence
Author(s) Name:  Yizhou Chen, Shizhuo Zhang, Bryan Kian Hsiang Low
Journal name:  
Conference name:  Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
Publisher name:  PMLR
DOI:  
Volume Information:  
Paper Link:   https://proceedings.mlr.press/v151/chen22h.html