Unsupervised domain adaptation of object detectors: A survey - 2023


Research Paper on Unsupervised domain adaptation of object detectors: A survey

Research Area:  Machine Learning

Abstract:

Recent advances in deep learning have led to the development of accurate and efficient models for various computer vision applications such as classification, segmentation, and detection. However, learning highly accurate models relies on the availability of large-scale annotated datasets. As a result, model performance drops drastically when models are evaluated on label-scarce datasets with visually distinct images, an issue termed the domain adaptation problem. A plethora of works adapt classification and segmentation models to label-scarce target datasets through unsupervised domain adaptation. Considering that detection is a fundamental task in computer vision, many recent works have focused on developing novel domain adaptive detection techniques. Here, we describe the domain adaptation problem for detection in detail and present an extensive survey of the various methods. Furthermore, we highlight the strategies proposed and their associated shortcomings. Subsequently, we identify multiple aspects of the problem that are most promising for future research. We believe this survey will be valuable to pattern recognition experts working in the fields of computer vision, biometrics, medical imaging, and autonomous navigation by introducing them to the problem, familiarizing them with the current state of progress, and providing promising directions for future research.
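A common strategy in the adversarial family of domain adaptive detectors surveyed in this literature is the gradient reversal layer (GRL): the forward pass is the identity, while the backward pass flips (and scales) the gradient, so the feature extractor is trained to produce features that confuse a domain classifier. The sketch below is a minimal, hypothetical numpy illustration of that mechanism, not code from the paper; the class name and the `lam` coefficient are illustrative assumptions.

```python
import numpy as np

class GradientReversal:
    """Minimal sketch of a gradient reversal layer (GRL).

    Forward: identity on the input features.
    Backward: multiplies the incoming gradient by -lam, so gradients flowing
    from the domain classifier into the feature extractor are reversed,
    encouraging domain-invariant features. `lam` is an illustrative
    trade-off coefficient, not a value from the surveyed paper.
    """

    def __init__(self, lam: float = 1.0):
        self.lam = lam

    def forward(self, x: np.ndarray) -> np.ndarray:
        # Identity: features pass through unchanged to the domain classifier.
        return x

    def backward(self, grad_out: np.ndarray) -> np.ndarray:
        # Sign-flipped, scaled gradient sent back to the feature extractor.
        return -self.lam * grad_out
```

In a full detector this layer would sit between the backbone features and an auxiliary domain classifier; the detection losses are unaffected because the forward pass is the identity.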

Keywords:  
Unsupervised domain adaptation
object detectors

Author(s) Name:  Poojan Oza, Vishwanath A. Sindagi, Vibashan Vs, Vishal M. Patel

Journal name:  IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023.

Conference name:  

Publisher name:  IEEE

DOI:  10.1109/TPAMI.2022.3217046

Volume Information:  Volume 46, Pages 4018-4040 (2023)