Universal Domain Adaptation Via Compressive Attention Matching - 2023

Research Paper on Universal Domain Adaptation Via Compressive Attention Matching

Research Area:  Machine Learning

Abstract:

Universal domain adaptation (UniDA) aims to transfer knowledge from the source domain to the target domain without any prior knowledge about the label set. The challenge lies in determining whether target samples belong to common categories. Mainstream methods make this judgment from sample features alone, which overemphasizes global information while ignoring the most crucial local objects in the image, resulting in limited accuracy. To address this issue, we propose a Universal Attention Matching (UniAM) framework that exploits the self-attention mechanism in vision transformers to capture crucial object information. The framework introduces a novel Compressive Attention Matching (CAM) approach that extracts the core information by compressively representing attentions. Furthermore, CAM incorporates a residual-based measurement to determine sample commonness. Using this measurement, UniAM achieves domain-wise and category-wise Common Feature Alignment (CFA) and Target Class Separation (TCS). Notably, UniAM is the first method to directly utilize vision transformer attention for classification tasks. Extensive experiments show that UniAM outperforms current state-of-the-art methods on various benchmark datasets.
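As a rough illustration of the residual-based commonness idea described in the abstract, the sketch below reconstructs a target sample's compressed attention vector as a least-squares combination of per-class source attention vectors and uses the reconstruction residual as a commonness signal. This is only a minimal sketch under assumed data shapes; the names (class_residuals, commonness_score) and the plain least-squares solver are illustrative choices, not the authors' actual CAM implementation.

import numpy as np

def class_residuals(a_t, prototypes):
    # Least-squares reconstruction of one target attention vector from each
    # source class's bank of attention vectors; returns per-class residual norms.
    residuals = {}
    for c, A_c in prototypes.items():
        coef, *_ = np.linalg.lstsq(A_c, a_t, rcond=None)   # fit a_t ~ A_c @ coef
        residuals[c] = np.linalg.norm(a_t - A_c @ coef)
    return residuals

def commonness_score(a_t, prototypes):
    # Hypothetical commonness measure: the smallest per-class residual.
    # A small residual suggests the sample belongs to a shared (common) class;
    # a large residual suggests a target-private ("unknown") class.
    r = class_residuals(a_t, prototypes)
    best = min(r, key=r.get)
    return best, r[best]

# Toy usage with assumed shapes: 3 source classes, 64-dim compressed attention
# vectors, 10 source attention vectors per class.
rng = np.random.default_rng(0)
prototypes = {c: rng.normal(size=(64, 10)) for c in range(3)}
target_attention = rng.normal(size=64)
label, residual = commonness_score(target_attention, prototypes)
print(label, residual)

In the paper itself, CAM compresses the attention maps before matching, and UniAM then uses the commonness measurement to drive Common Feature Alignment and Target Class Separation; the sketch only shows how a reconstruction residual can serve as such a signal.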

Keywords:  

Author(s) Name:  Didi Zhu, Yinchuan Li, Junkun Yuan, Zexi Li, Kun Kuang, Chao Wu

Journal name:  arXiv preprint (subject area: Computer Vision and Pattern Recognition, cs.CV)

Conference name:  

Publisher name:  arXiv

DOI:  10.48550/arXiv.2304.11862

Volume Information:  Volume 5 (2023)