Research Area:  Machine Learning
In the era of big data, unlabeled data are generated at an ever-increasing rate, and deep clustering analysis has therefore become prevalent in artificial intelligence. Yet, most existing deep clustering methods still have the following demerits: 1) they fail to take the cluster-specificity distribution into account, resulting in suboptimal latent representations; 2) they suffer from a scale issue between the distributions of a given sample and the cluster centers, resulting in unstable clustering performance; 3) they fail to utilize the obtained clustering labels, resulting in suboptimal clustering performance. To fill these gaps, we propose a new deep clustering solution, namely Adversarial Self-supervised Clustering with Cluster-specificity Distribution (ASC2D). Specifically, by imposing the cluster-specificity constraint, which is measured by the -norm, the learned latent representation well encodes the cluster structure. Meanwhile, by introducing the idea of adversarial learning, ASC2D effectively eliminates the gaps between distributions. Moreover, ASC2D utilizes the clustering labels to supervise representation learning, and the learned representation is in turn used to conduct the subsequent clustering. In this way, clustering and representation learning are seamlessly connected, with the aim of achieving better clustering performance. Extensive experimental results show that ASC2D is superior to 14 state-of-the-art baselines on six image datasets in terms of three evaluation metrics; in particular, on the Fashion-MNIST dataset, ASC2D improves over the best baseline in terms of the ACC and NMI metrics.
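To make the alternation described in the abstract concrete (clustering labels supervise representation learning, and the representation in turn drives the next round of clustering), here is a minimal NumPy sketch of that feedback loop in latent space. This is an illustrative assumption only: it uses a plain k-means-style alternation on fixed toy embeddings, and does not include the paper's adversarial component, norm-based cluster-specificity constraint, or any neural encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def assign_pseudo_labels(z, centers):
    # Assign each latent point to its nearest cluster center;
    # these assignments play the role of self-supervision signals.
    d = ((z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def update_centers(z, labels, k):
    # Re-estimate each cluster center from the points currently
    # assigned to it (the "use labels to refine the model" step).
    return np.stack([z[labels == c].mean(axis=0) for c in range(k)])

# Toy latent data: two well-separated blobs in 2-D.
z = np.concatenate([rng.normal(0.0, 0.1, (50, 2)),
                    rng.normal(3.0, 0.1, (50, 2))])

# Deterministic initialization: one point from each blob.
centers = np.stack([z[0], z[-1]])

# Alternate label assignment and center refinement, mimicking
# (very loosely) the clustering <-> representation feedback loop.
for _ in range(10):
    labels = assign_pseudo_labels(z, centers)
    centers = update_centers(z, labels, 2)
```

In ASC2D the representation itself is also updated between rounds; this sketch keeps the embeddings fixed purely to show the label-then-refine alternation.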
Keywords:  
Unlabeled data
Clustering analysis
Artificial Intelligence
Representation learning
Evaluation metrics
Author(s) Name:  Wei Xia, Xiangdong Zhang, Quanxue Gao
Journal name:  Neurocomputing
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.neucom.2021.03.108
Volume Information:  Volume 449
Paper Link:   https://www.sciencedirect.com/science/article/abs/pii/S0925231221005002