I downloaded this GitHub repository and ran the notebooks: https://github.com/tinrabuzin/MVBPCA. The Jupyter notebooks in the repository show that this technique is effective at inferring the parameters of multivariate Gaussian distributions (mean and covariance matrix). To a first approximation, everything we look at can be modelled as a multivariate Gaussian distribution, so in those terms it's a pretty good tool.

To run them you need to have sklearn and matplotlib installed:

```
conda install scikit-learn
conda install matplotlib
```

The notebooks refer to a deprecated function, `bivariate_normal`, from matplotlib's `mlab` module. It can be reproduced as follows:

```
import numpy as np
import scipy.stats

def mlab_bivariate_normal(x, y, sigma1, sigma2, mu1, mu2, off_cov):
    # Build the mean vector and covariance matrix from the per-axis
    # standard deviations and the off-diagonal covariance term
    mean = np.array([mu1, mu2])
    cov = np.array([[sigma1**2, off_cov],
                    [off_cov, sigma2**2]])
    rv = scipy.stats.multivariate_normal(mean, cov)

    # Stack the two coordinate grids into an (..., 2) array of points
    pos = np.empty(x.shape + (2,))
    pos[:, :, 0] = x
    pos[:, :, 1] = y

    # Evaluate the Gaussian density at every grid point
    return rv.pdf(pos)
```
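As a quick sanity check, the replacement can be evaluated on a meshgrid and contour-plotted the way the original matplotlib call typically was. The grid range and parameter values below are just illustrative:

```
import numpy as np
import matplotlib.pyplot as plt

# Illustrative grid and parameters (not taken from the notebooks)
x, y = np.meshgrid(np.linspace(-3, 3, 100), np.linspace(-3, 3, 100))
p = mlab_bivariate_normal(x, y, sigma1=1.0, sigma2=1.5, mu1=0.0, mu2=0.0, off_cov=0.5)

# Density contours of the bivariate Gaussian
plt.contour(x, y, p)
plt.gca().set_aspect('equal')
plt.show()
```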

Created by Lars Ericson
