Deep Semantic Similarity Adversarial Hashing for Cross-Modal Retrieval - 2020

Research Area:  Machine Learning

Abstract:

Cross-modal retrieval has attracted considerable attention due to the rapid development of the Internet and social media, and cross-modal hashing has been widely and successfully used in this domain. However, most existing hashing methods pay little attention to the levels of semantic similarity between instances, simply classifying the semantic relationship as either similar or dissimilar. Moreover, the preservation of the semantic similarity of the original data in the extracted features has been little explored by existing methods. Due to the heterogeneity between modalities, the similarity of features from different modalities cannot be calculated directly. Therefore, in this paper, we propose deep semantic similarity adversarial hashing (DSSAH) for cross-modal retrieval. We first calculate semantic similarity using both label and feature information to provide a more accurate value for the similarity between instances. Then an adversarial modality discriminator is introduced to establish a common feature space in which the similarity of features from each modality can be calculated. Finally, two loss functions, an inter-modal loss and an intra-modal loss, are designed to generate high-quality hash codes. Experiments on three common cross-modal retrieval datasets show that DSSAH outperforms state-of-the-art cross-modal hashing methods in cross-modal retrieval applications.
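
A minimal illustrative sketch may help make the abstract's two core ideas concrete: a graded semantic-similarity matrix built from both label and feature information, and an adversarial modality discriminator trained to distinguish image features from text features in the common space. This is not the authors' implementation; the network sizes, the cosine-similarity blend, and the `alpha` weighting are assumptions for illustration, written here in PyTorch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def semantic_similarity(labels, feats, alpha=0.5):
    """Blend label-based and feature-based cosine similarity into one
    graded similarity matrix (assumed 50/50 weighting, not the paper's)."""
    lab = F.normalize(labels.float(), dim=1)   # multi-hot label vectors
    fea = F.normalize(feats, dim=1)            # extracted features
    sim_label = lab @ lab.t()                  # label cosine similarity
    sim_feat = fea @ fea.t()                   # feature cosine similarity
    return alpha * sim_label + (1 - alpha) * sim_feat

class ModalityDiscriminator(nn.Module):
    """Predicts whether a common-space feature came from the image branch
    or the text branch; the encoders are trained to fool it."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim // 2), nn.ReLU(),
                                 nn.Linear(dim // 2, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)  # logit: high -> "image", low -> "text"

# Toy usage with random stand-ins for encoded image/text features and labels.
torch.manual_seed(0)
image_feats = torch.randn(8, 64)        # hypothetical common-space image features
text_feats = torch.randn(8, 64)         # hypothetical common-space text features
labels = torch.randint(0, 2, (8, 10))   # multi-hot labels over 10 classes

S = semantic_similarity(labels, image_feats)  # graded targets for hash learning

disc = ModalityDiscriminator(64)
logits = disc(torch.cat([image_feats, text_feats]))
modality = torch.cat([torch.ones(8), torch.zeros(8)])              # 1 = image, 0 = text
d_loss = F.binary_cross_entropy_with_logits(logits, modality)      # discriminator step
g_loss = F.binary_cross_entropy_with_logits(logits, 1 - modality)  # encoder "fooling" step
print(S.shape, d_loss.item(), g_loss.item())
```

In a full training loop, `d_loss` would update only the discriminator and `g_loss` only the image/text encoders, alternating as in standard adversarial training; the inter-modal and intra-modal losses described in the abstract would be added on top to fit the binary hash codes.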

Keywords:  

Author(s) Name:  Haopeng Qiang, Yuan Wan, Lun Xiang, Xiaojing Meng

Journal name:  Neurocomputing

Conference name:

Publisher name:  Elsevier

DOI:  10.1016/j.neucom.2020.03.032

Volume Information:  Volume 400, 4 August 2020, Pages 24-33