
Multimodal Generative Learning Utilizing Jensen-Shannon-Divergence - 2020

Research paper on Multimodal Generative Learning Utilizing Jensen-Shannon-Divergence

Research Area:  Machine Learning

Abstract:

Learning from different data types is a long-standing goal in machine learning research, as multiple information sources co-occur when describing natural phenomena. However, existing generative models that approximate a multimodal ELBO rely on difficult or inefficient training schemes to learn a joint distribution and the dependencies between modalities. In this work, we propose a novel, efficient objective function that utilizes the Jensen-Shannon divergence for multiple distributions. It simultaneously approximates the unimodal and joint multimodal posteriors directly via a dynamic prior. In addition, we theoretically prove that the new multimodal JS-divergence (mmJSD) objective optimizes an ELBO. In extensive experiments, we demonstrate the advantage of the proposed mmJSD model compared to previous work in unsupervised, generative learning tasks.
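As a rough illustration of the divergence the abstract refers to, the sketch below computes the generalized Jensen-Shannon divergence for M distributions: a weighted average of KL divergences from each distribution to their weighted mixture. This is only a minimal NumPy example over discrete distributions; the paper's mmJSD objective applies the analogous quantity to the unimodal and joint posteriors of a multimodal VAE via a dynamic prior, and the function and variable names here are illustrative, not taken from the authors' code.

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # KL(p || q) for discrete distributions given as 1-D probability vectors.
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

    def generalized_js_divergence(dists, weights=None, eps=1e-12):
        # Generalized JS divergence JS_pi(p_1, ..., p_M):
        # sum_i pi_i * KL(p_i || m), where m = sum_i pi_i * p_i is the weighted mixture.
        dists = [np.asarray(p, dtype=float) for p in dists]
        M = len(dists)
        if weights is None:
            weights = np.full(M, 1.0 / M)  # uniform weights by default
        mixture = sum(w * p for w, p in zip(weights, dists))
        return sum(w * kl_divergence(p, mixture, eps) for w, p in zip(weights, dists))

    # Example: three hypothetical "unimodal posteriors" over a small discrete latent space.
    p1 = np.array([0.7, 0.2, 0.1])
    p2 = np.array([0.5, 0.3, 0.2])
    p3 = np.array([0.2, 0.3, 0.5])
    print(generalized_js_divergence([p1, p2, p3]))  # non-negative, zero only if all distributions coincide

The mixture here plays the role of the dynamic prior described in the abstract: it is computed from the posteriors themselves rather than fixed in advance.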

Keywords:  Multimodal, Generative Learning, JS-divergence, Unsupervised, Machine Learning

Author(s) Name:  Thomas M. Sutter, Imant Daunhawer, Julia E. Vogt

Journal name:  Machine Learning

Conference name:  

Publisher name:  arXiv (arXiv:2006.08242)

DOI:  10.48550/arXiv.2006.08242

Volume Information: