
Heterogeneous graph attention network for unsupervised multiple-target domain adaptation - 2020


Research Area:  Machine Learning


Domain adaptation, which transfers knowledge from a label-rich source domain to unlabeled target domains, is a challenging task in machine learning. Prior domain adaptation methods focus on the pairwise setting with a single source and a single target domain, while little work concerns the scenario of one source domain and multiple target domains. Applying pairwise adaptation methods to this setting can be suboptimal, as they fail to consider the semantic associations among multiple target domains. In this work, we propose a deep semantic information propagation approach in the novel context of multiple unlabeled target domains and one labeled source domain. Our model aims to learn a unified subspace common to all domains with a heterogeneous graph attention network, where the transductive ability of the graph attention network conducts semantic propagation among related samples across domains. In particular, the attention mechanism is applied to optimize the relationships among samples from multiple domains for better semantic transfer. Then, the pseudo labels of the target domains predicted by the graph attention network are used to learn domain-invariant representations by aligning the labeled source centroids with the pseudo-labeled target centroids. We test our approach on four challenging public datasets, and it outperforms several popular domain adaptation methods.
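The two key components described above, attention-weighted propagation among related samples and class-centroid alignment between source and pseudo-labeled target features, can be sketched as follows. This is a minimal NumPy illustration of the general ideas, not the authors' implementation; all function names, shapes, and the simple LeakyReLU attention form are assumptions.

```python
import numpy as np

def graph_attention_propagate(X, adj, W, a):
    """One graph-attention layer sketch: propagate features among related samples.

    X:   (n, d) sample features from all domains
    adj: (n, n) 0/1 adjacency linking related samples (should include self-loops)
    W:   (d, h) projection matrix
    a:   (2h,)  attention vector for the concatenated pair [h_i || h_j]
    """
    H = X @ W                      # project features into the shared subspace
    n = H.shape[0]
    e = np.full((n, n), -np.inf)   # attention logits; -inf masks non-neighbors
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                z = a @ np.concatenate([H[i], H[j]])
                e[i, j] = z if z > 0 else 0.2 * z   # LeakyReLU
    e = e - e.max(axis=1, keepdims=True)            # stabilize softmax
    alpha = np.exp(e)
    alpha /= alpha.sum(axis=1, keepdims=True)       # normalize over neighbors
    return alpha @ H               # attention-weighted feature propagation

def centroid_alignment_loss(src_feat, src_y, tgt_feat, tgt_pseudo_y, n_classes):
    """Squared distance between per-class source centroids and
    pseudo-labeled target centroids (one sketch of centroid alignment)."""
    loss = 0.0
    for c in range(n_classes):
        s = src_feat[src_y == c]
        t = tgt_feat[tgt_pseudo_y == c]
        if len(s) and len(t):
            loss += float(np.sum((s.mean(axis=0) - t.mean(axis=0)) ** 2))
    return loss
```

In this sketch, the graph attention layer lets pseudo-label information flow among related samples across domains, and the centroid loss pulls each class's target centroid toward its source counterpart, encouraging domain-invariant representations.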


Author(s) Name:  Xu Yang; Cheng Deng; Tongliang Liu; Dacheng Tao

Journal name:  IEEE Transactions on Pattern Analysis and Machine Intelligence (Early Access)


Publisher name:  IEEE

DOI:  10.1109/TPAMI.2020.3026079

Volume Information:  Page(s): 1 - 1