Semi-Supervised Transfer Learning with Hierarchical Self-Regularization - 2023

Research Area:  Machine Learning

Abstract:

Both semi-supervised learning and transfer learning aim at lowering the annotation burden for training models. However, the two tasks are usually studied separately, i.e., most semi-supervised learning algorithms train models from scratch, while transfer learning assumes pre-trained models as the initialization. In this work, we focus on a previously less-explored setting that further reduces the annotation effort by combining semi-supervised and transfer learning, where a pre-trained source model is used as the initialization of semi-supervised learning. As powerful pre-trained models are now ubiquitously available and can considerably benefit various downstream tasks, this setting is relevant to real-world applications, yet designing effective algorithms for it is challenging. Aiming at enabling transfer learning under semi-supervised settings, we propose a hierarchical self-regularization mechanism to exploit unlabeled samples more effectively, introducing a novel self-regularizer that incorporates both individual-level and population-level regularization terms. The former employs self-distillation to regularize the learned deep features of each individual sample, and the latter enforces self-consistency between the feature distributions of labeled and unlabeled samples. Samples involved in both regularizers are weighted by an adaptive strategy, where the self-regularization effect of each term is controlled by the confidence of every sample. To validate our algorithm, extensive experiments were conducted on diverse datasets, including CIFAR-10 for general object recognition, CUB-200-2011/MIT-indoor-67 for fine-grained classification, and MURA for medical image classification. Compared with state-of-the-art semi-supervised learning methods, including Pseudo Label, Mean Teacher, MixMatch and FixMatch, our algorithm demonstrates two advantages: first, it adopts a new point of view to tackle the problems caused by inadequate supervision and achieves very competitive results; second, it is complementary to these state-of-the-art methods and can therefore be combined with them for additional improvements. Furthermore, our method can also be applied to fully supervised transfer learning and self-supervised learning.
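
The abstract's two-level regularizer can be illustrated with a short, hypothetical PyTorch-style sketch. This is not the authors' code: the model interfaces (each model returning features and logits), the squared-error form of the feature self-distillation, the batch-statistic matching used for the population-level term, the confidence-based sample weights, and the lambda hyperparameters are all illustrative assumptions about how such a loss could be assembled.

```python
# Hypothetical sketch of a hierarchical self-regularization loss (assumed
# formulation, not the paper's exact one): an individual-level term that
# self-distills each sample's deep features toward the frozen pre-trained
# source model, plus a population-level term that aligns labeled and
# unlabeled feature statistics, both weighted by per-sample confidence.
import torch
import torch.nn.functional as F


def hierarchical_self_regularization(student, source, x_labeled, x_unlabeled,
                                     lambda_ind=1.0, lambda_pop=1.0):
    """Return a combined regularization loss for one mixed batch.

    `student` and `source` are assumed to return (features, logits);
    `source` is the frozen pre-trained model used as initialization.
    """
    x_all = torch.cat([x_labeled, x_unlabeled], dim=0)
    feats, logits = student(x_all)
    with torch.no_grad():
        src_feats, _ = source(x_all)

    # Adaptive per-sample weights: confidence = max softmax probability.
    conf = F.softmax(logits, dim=1).max(dim=1).values.detach()

    # Individual-level term: feature self-distillation toward the frozen
    # source model, weighted by each sample's confidence.
    ind_loss = (conf * (feats - src_feats).pow(2).mean(dim=1)).mean()

    # Population-level term: self-consistency between labeled and unlabeled
    # feature distributions, approximated here by matching batch mean/variance.
    n_l = x_labeled.size(0)
    f_l, f_u = feats[:n_l], feats[n_l:]
    pop_loss = (f_l.mean(dim=0) - f_u.mean(dim=0)).pow(2).mean() \
             + (f_l.var(dim=0) - f_u.var(dim=0)).pow(2).mean()

    return lambda_ind * ind_loss + lambda_pop * pop_loss
```

In practice such a term would be added to the usual supervised loss on the labeled subset (and, if desired, to a pseudo-labeling or consistency loss from methods like FixMatch), which reflects the paper's claim that the approach is complementary to existing semi-supervised methods.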

Keywords:  
Transfer learning
Unlabeled samples
Regularization
Image classification
Semi-supervised learning

Author(s) Name:  Xingjian Li, Abulikemu Abuduweili, Humphrey Shi

Journal name:  Pattern Recognition

Conference name:  

Publisher name:  Elsevier

DOI:  10.1016/j.patcog.2023.109831

Volume Information:  Volume 144