

Self-supervised Learning: Generative or Contrastive - 2021

Research Area:  Machine Learning


Deep supervised learning has achieved great success in the last decade. However, its heavy dependence on manual labels and its vulnerability to attacks have driven researchers to seek other paradigms. As an alternative, self-supervised learning (SSL) has attracted many researchers with its strong performance on representation learning in recent years. Self-supervised representation learning leverages the input data itself as supervision and benefits almost all types of downstream tasks. In this survey, we examine new self-supervised learning methods for representation learning in computer vision, natural language processing, and graph learning. We comprehensively review the existing empirical methods and summarize them into three main categories according to their objectives: generative, contrastive, and generative-contrastive (adversarial). We further collect related theoretical analyses of self-supervised learning to provide deeper insight into why self-supervised learning works. Finally, we briefly discuss open problems and future directions for self-supervised learning. An outline slide for the survey is provided.
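To make the contrastive category concrete, the following is a minimal sketch of an InfoNCE-style objective of the kind used by contrastive SSL methods such as SimCLR and MoCo. This is an illustrative toy implementation in NumPy, not code from the surveyed paper; the function name, temperature value, and synthetic data are assumptions. The idea is that each "anchor" representation should score highest against its own augmented "positive" view, with the rest of the batch serving as negatives — the input data itself supplies the supervision signal.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Toy InfoNCE contrastive loss: anchor i should match positive i
    against every other positive in the batch (the negatives)."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                   # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Correct pairings lie on the diagonal: anchor i <-> positive i
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))                         # 8 embeddings, dim 16
# Positives as slightly perturbed "augmentations" of the anchors
loss_aligned = info_nce_loss(z, z + 0.01 * rng.normal(size=z.shape))
# Unrelated positives: no shared structure with the anchors
loss_random = info_nce_loss(z, rng.normal(size=z.shape))
print(loss_aligned, loss_random)
```

When the positives really are views of the same underlying inputs, the diagonal similarities dominate and the loss is small; for unrelated pairs the loss stays near log N, which is what drives the encoder to produce augmentation-invariant representations.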

Author(s) Name:  Xiao Liu; Fanjin Zhang; Zhenyu Hou; Li Mian; Zhaoyu Wang; Jing Zhang; Jie Tang

Journal name:  IEEE Transactions on Knowledge and Data Engineering (Early Access)


Publisher name:  IEEE

DOI:  10.1109/TKDE.2021.3090866

Volume Information:   Page(s): 1 - 1