Research Area:  Machine Learning
To improve classification, generative models are often used to initialize model parameters and extract features before a classifier is trained. Typically, this means solving separate unsupervised and supervised learning problems. Generative restricted Boltzmann machines and deep belief networks are widely used for the unsupervised phase. To improve on this two-phase strategy, we developed several supervised models based on deep belief networks. The models are built by modifying the loss function to account for the expectation with respect to the underlying generative model, by introducing weight bounds, and by multi-level programming. The proposed models capture both unsupervised and supervised objectives effectively. A computational study verifies that our models outperform the two-phase training approach. In addition, we conduct an ablation study to examine how different parts of our models and different mixes of training samples affect performance.
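To make the baseline concrete, the sketch below shows the standard two-phase strategy the paper improves on: an unsupervised restricted Boltzmann machine learns features, and a separate supervised classifier is then trained on those features. This is only an illustration of the baseline, not the authors' proposed joint models; the dataset, layer sizes, and hyperparameters are assumptions chosen for a runnable example.

```python
# Minimal sketch of two-phase training: unsupervised RBM feature
# learning followed by a separate supervised classifier.
# Dataset and hyperparameters are illustrative assumptions only.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = (X - X.min()) / (X.max() - X.min())  # scale inputs to [0, 1] for the RBM
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Phase 1: unsupervised generative pretraining (RBM feature extractor).
rbm = BernoulliRBM(n_components=128, learning_rate=0.06, n_iter=20, random_state=0)

# Phase 2: supervised classifier trained on the extracted features.
clf = LogisticRegression(max_iter=1000)

two_phase = Pipeline([("rbm", rbm), ("logreg", clf)])
two_phase.fit(X_train, y_train)
print("two-phase test accuracy:", two_phase.score(X_test, y_test))
```

The paper's contribution, by contrast, is to fold the supervised objective into the deep belief network itself (via an expectation-based loss, weight bounds, and multi-level programming) rather than keeping the two phases separate as above.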
Keywords:  
Author(s) Name:  Jaehoon Koo, Diego Klabjan
Journal name:  
Conference name:  Artificial Neural Networks and Machine Learning – ICANN
Publisher name:  Springer
DOI:  10.1007/978-3-030-61609-0_43
Volume Information:  2020, pp. 541-552
Paper Link:   https://link.springer.com/chapter/10.1007/978-3-030-61609-0_43