https://doi.org/10.5194/npg-16-57-2009
06 Feb 2009

Information theoretic measures of dependence, compactness, and non-Gaussianity for multivariate probability distributions

A. H. Monahan and T. DelSole

Abstract. A basic task of exploratory data analysis is the characterisation of "structure" in multivariate datasets. For bivariate Gaussian distributions, natural measures of dependence (the predictive relationship between individual variables) and compactness (the degree of concentration of the probability density function, or pdf, around a low-dimensional axis) are provided, respectively, by ordinary least-squares regression and Principal Component Analysis (PCA). This study considers general measures of structure for non-Gaussian distributions and demonstrates that these can be defined in terms of the information theoretic "distance" (as measured by relative entropy) between the given pdf and an appropriate "unstructured" pdf. The resulting measure of dependence, mutual information, is well known; it is shown, however, that mutual information is not a useful measure of compactness because it is not invariant under orthogonal rotations of the variables. An appropriate rotationally invariant compactness measure is defined and shown to reduce to the equivalent PCA measure for bivariate Gaussian distributions. This compactness measure is naturally related to a standard information theoretic measure of non-Gaussianity. Finally, straightforward geometric interpretations of each of these measures in terms of the "effective volume" of the pdf are presented.
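
For orientation, the standard definitions underlying these measures (textbook forms, not quoted from the paper itself) can be written in LaTeX as:

    % Relative entropy (Kullback-Leibler divergence) of a pdf p from a reference pdf q:
    D(p \,\|\, q) = \int p(\mathbf{x}) \, \ln \frac{p(\mathbf{x})}{q(\mathbf{x})} \, d\mathbf{x}

    % Mutual information: the relative entropy of the joint pdf from the
    % "unstructured" independent pdf built from its marginals:
    I(x; y) = D\big( p(x, y) \,\|\, p(x)\, p(y) \big)

    % For a bivariate Gaussian with correlation coefficient rho, this reduces to
    I(x; y) = -\tfrac{1}{2} \ln\!\big( 1 - \rho^2 \big)

The Gaussian special case makes the link to least-squares regression explicit: mutual information vanishes for uncorrelated variables and grows without bound as |rho| approaches 1.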
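The rotation argument against using mutual information as a compactness measure can also be checked numerically. The following is a minimal sketch, assuming a bivariate Gaussian and the closed form above; the names gaussian_mutual_info and rotated are illustrative, not from the paper:

    import numpy as np

    def gaussian_mutual_info(cov):
        # I(x; y) = 0.5 * ln(C_xx * C_yy / det C) in nats, valid only for a
        # bivariate Gaussian with covariance matrix C.
        return 0.5 * np.log(cov[0, 0] * cov[1, 1] / np.linalg.det(cov))

    def rotated(cov, theta):
        # Covariance matrix of the orthogonally rotated variables R(theta) (x, y).
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        return R @ cov @ R.T

    cov = np.array([[1.0, 0.8],
                    [0.8, 1.0]])

    # Mutual information depends on the orientation of the coordinate axes:
    for theta in (0.0, np.pi / 8, np.pi / 4):
        print(f"theta = {theta:.3f}  I = {gaussian_mutual_info(rotated(cov, theta)):.4f}")

Rotating by pi/4 diagonalises this covariance matrix (the PCA axes), so the printed mutual information falls from about 0.51 nats to zero, even though the pdf's concentration about its leading axis is unchanged; this is exactly the non-invariance that motivates a separate, rotationally invariant compactness measure.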