
Scalable Mutual Information Estimation Using Dependence Graphs - 2019


Research Area:  Machine Learning

Abstract:

Mutual information (MI) is a widely used measure of dependency between two random variables in information theory, statistics, and machine learning. Several MI estimators have recently been proposed that achieve the parametric MSE convergence rate; however, most have high computational complexity of at least O(N^2). We propose a unified method for empirical non-parametric estimation of general MI functions between random vectors in R^d based on N i.i.d. samples. The reduced-complexity MI estimator, called the ensemble dependency graph estimator (EDGE), combines randomized locality-sensitive hashing (LSH), dependency graphs, and ensemble bias-reduction methods. We prove that EDGE achieves optimal computational complexity O(N), and can achieve the optimal parametric MSE rate of O(1/N) if the density is d times differentiable. To the best of our knowledge, EDGE is the first non-parametric MI estimator that can achieve parametric MSE rates with linear time complexity. We illustrate the utility of EDGE for the analysis of the information plane (IP) in deep learning. Using EDGE, we shed light on the controversy over whether the compression property of the information bottleneck (IB) in fact holds for ReLU and other rectification functions in deep neural networks (DNNs).
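The core idea in the abstract can be illustrated with a minimal sketch: hash the X and Y samples into buckets with a simple randomized quantization (a basic LSH family), then compute a plug-in MI estimate from joint and marginal bucket counts in O(N) time. This is a simplified stand-in, not the authors' exact EDGE estimator; the hash family, bandwidth `eps`, and the plain plug-in formula (without EDGE's ensemble bias correction) are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def lsh_hash(Z, eps, shift):
    # Randomly shifted grid quantization: a simple LSH family on R^d.
    return [tuple(np.floor((z + shift) / eps).astype(int)) for z in Z]

def mi_estimate(X, Y, eps=0.3, seed=0):
    """Plug-in MI estimate over LSH buckets (simplified sketch; omits
    the ensemble bias-reduction step of the actual EDGE estimator)."""
    rng = np.random.default_rng(seed)
    N = len(X)
    hx = lsh_hash(X, eps, rng.uniform(0, eps, X.shape[1]))
    hy = lsh_hash(Y, eps, rng.uniform(0, eps, Y.shape[1]))
    cx, cy, cxy = Counter(hx), Counter(hy), Counter(zip(hx, hy))
    # I_hat = sum_{ij} p_ij * log( p_ij / (p_i * p_j) ), computed in O(N)
    # since only occupied joint buckets are visited.
    return sum((nij / N) * np.log(N * nij / (cx[i] * cy[j]))
               for (i, j), nij in cxy.items())

# Strongly dependent data should yield a clearly positive estimate,
# while independent data should give a value near zero.
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 1))
Y = X + 0.1 * rng.normal(size=(2000, 1))
print(mi_estimate(X, Y))
```

Because the marginal counts are consistent with the joint counts, the plug-in value is a KL divergence and therefore non-negative; for independent inputs it stays close to zero (up to a small positive bias that EDGE's ensemble step is designed to reduce).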

Keywords:  

Author(s) Name:  Morteza Noshad; Yu Zeng; Alfred O. Hero

Journal name:  

Conference name:  International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Publisher name:  IEEE

DOI:  10.1109/ICASSP.2019.8683351

Volume Information: