Prob.pylib_dphem#

E_step(ingmm, gmm, opt)#
M_step(gmm, ingmm, Y, opt)#

Update the parameters of gmm.

USAGE: gmm = M_step(gmm, ingmm, Y, opt)
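
The two steps are typically alternated until the objective stops improving. Below is a minimal driver sketch; that E_step returns the statistics Y consumed by M_step, and that gmm_ll on ingmm.mu is a sensible convergence monitor, are assumptions, not guarantees from the signatures.

import numpy as np

def run_em(ingmm, gmm, opt, max_iter=100, tol=1e-6):
    # Hypothetical driver loop; E_step's return value is an assumption.
    prev = -np.inf
    for _ in range(max_iter):
        Y = E_step(ingmm, gmm, opt)        # expected sufficient statistics (assumed)
        gmm = M_step(gmm, ingmm, Y, opt)   # re-estimate gmm from Y
        LL, _, _ = gmm_ll(ingmm.mu, gmm)   # per-point log-likelihood [n x 1]
        obj = float(np.sum(LL))
        if obj - prev < tol:               # stop once improvement stalls
            break
        prev = obj
    return gmm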

do_split(ingmm, tagk, mytarget=None)#

Split ingmm into tagk components.

USAGE: gmm = do_split(ingmm, tagk)
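
The splitting rule itself is not documented here. A common heuristic, shown as a sketch below, splits a single Gaussian along its principal eigenvector; the (pi, mu, cv) weight/mean/covariance layout is an assumption about the GMM container.

import numpy as np

def split_component(pi, mu, cv):
    # Split one Gaussian (pi, mu, cv) into two children offset along
    # the dominant eigenvector of its covariance.
    eigval, eigvec = np.linalg.eigh(cv)           # ascending eigenvalues
    delta = eigvec[:, -1] * np.sqrt(eigval[-1])   # one-sigma step along the principal axis
    return (0.5 * pi, mu + delta, cv), (0.5 * pi, mu - delta, cv)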

draw_ellipse(position, covariance, ax=None, **kwargs)#

Draw an ellipse with a given position and covariance
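
A plausible implementation for a 2-d Gaussian, built from the eigendecomposition of the 2 x 2 covariance; the library's own version may scale or layer the contours differently.

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Ellipse

def draw_ellipse_sketch(position, covariance, ax=None, **kwargs):
    ax = ax or plt.gca()
    # The eigendecomposition gives the axis lengths and orientation.
    eigval, eigvec = np.linalg.eigh(covariance)   # ascending order
    angle = np.degrees(np.arctan2(eigvec[1, -1], eigvec[0, -1]))
    width, height = 2 * np.sqrt(eigval[-1]), 2 * np.sqrt(eigval[0])
    # Draw the 1-, 2-, and 3-sigma contours.
    for nsig in range(1, 4):
        ax.add_patch(Ellipse(position, nsig * width, nsig * height,
                             angle=angle, **kwargs))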

gcentroids(point, cluster, weight, cluster_num)#
genergy(point, weight, cluster_center, cluster_weight, cluster)#
gmm_ll(X, gmm)#

gmm_ll – log-likelihood of GMM

USAGE: LL, LLcomp, post = gmm_ll(X, gmm)

INPUTS:

X – 2d array (matrix) with each row the mean of a GMM component
gmm – GMM model

OUTPUTS:
LL – log-likelihood of X [n x 1] = log(gmm(X))
LLcomp – component log-likelihoods [n x K]
post – posterior probabilities [n x K]

if bkgndclass is used, LLcomp and post are [n x (K+1)], where the last column is the log-likelihood and posterior in the background class
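
For reference, a self-contained numpy sketch of how the three outputs relate; representing the model as explicit lists of weights, means, and covariances is an assumption for illustration, not the library's actual gmm structure.

import numpy as np
from scipy.stats import multivariate_normal

def gmm_ll_sketch(X, weights, means, covs):
    n, K = X.shape[0], len(weights)
    LLcomp = np.empty((n, K))
    for k in range(K):
        # log pi_k + log N(x | mu_k, Sigma_k)
        LLcomp[:, k] = (np.log(weights[k])
                        + multivariate_normal.logpdf(X, means[k], covs[k]))
    # LL = log sum_k exp(LLcomp), computed stably via log-sum-exp.
    m = LLcomp.max(axis=1, keepdims=True)
    LL = m[:, 0] + np.log(np.exp(LLcomp - m).sum(axis=1))
    post = np.exp(LLcomp - LL[:, None])   # posterior responsibilities [n x K]
    return LL, LLcomp, post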

gmm_plot1d(gmm, color, opt='cpg', dim=0)#

plot a gmm in 1d

INPUTS:

opt = 'c' – plot components
    = 'p' – plot priors
    = 'g' – plot gmm

dim = dimension to use [default = 0]
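
For intuition, the 'g' option corresponds to the overall 1-d density along dimension dim, i.e. the prior-weighted sum of component pdfs. A sketch under an assumed list-based container layout:

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

def plot_gmm_density_1d(weights, means, covs, dim=0):
    mus = np.array([mu[dim] for mu in means])
    sds = np.array([np.sqrt(cv[dim, dim]) for cv in covs])
    x = np.linspace((mus - 4 * sds).min(), (mus + 4 * sds).max(), 400)
    # Overall density: sum_k pi_k * N(x | mu_k, sigma_k^2) in dimension dim.
    density = sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, mus, sds))
    plt.plot(x, density)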

inv_chol(L)#

inv(A) where A = L*L^T with L lower triangular
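
Equivalently, with SciPy the inverse can be recovered from the precomputed factor by solving A X = I, which avoids forming inv(A) from scratch:

import numpy as np
from scipy.linalg import cho_solve

def inv_chol_sketch(L):
    # Solve A X = I using the lower Cholesky factor L of A.
    return cho_solve((L, True), np.eye(L.shape[0]))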

logComponent(gmm, X, c, d, N)#
logdet_chol(L)#

log(det(A)) where A = L*L^T with L lower triangular
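
This rests on the identity det(A) = det(L)^2 for A = L*L^T, so the log-determinant is twice the sum of the logs of L's diagonal:

import numpy as np

def logdet_chol_sketch(L):
    # log det(A) = 2 * sum(log(diag(L))) for A = L @ L.T
    return 2.0 * np.sum(np.log(np.diag(L)))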

logsumexp(x, axis=0)#
logtrick2(LLcomp)#

LLcomp: N x K matrix; each column holds the log-likelihoods of ingmm.mu under one component of gmm, i.e. log(a), log(b), …, log(k). Output: a column vector [N x 1] = log(a + b + … + k).
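
This is the standard log-sum-exp trick: shift each row by its maximum before exponentiating so the exponentials cannot overflow. A minimal numpy equivalent:

import numpy as np

def logtrick2_sketch(LLcomp):
    m = LLcomp.max(axis=1, keepdims=True)   # N x 1 row-wise maxima
    # log(sum_k exp(LLcomp[:, k])) = m + log(sum_k exp(LLcomp[:, k] - m))
    return m + np.log(np.exp(LLcomp - m).sum(axis=1, keepdims=True))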

my_weighted_kmeans(point, weights, init_cluster_center, ncluster, it_max)#

my_weighted_kmeans – perform weighted k-means clustering on the data in point, with each data point weighted by the corresponding entry of weights.

INPUT:

point = d x n matrix of data points
weights = n x 1 vector, one weight per data point
init_cluster_center = d x ncluster matrix of initial centers
ncluster = number of clusters required
it_max = maximum number of iterations

OUTPUT:

cluster_label = n x 1 vector indicating the cluster of each data point
cluster_center = d x ncluster matrix of final cluster centers
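
A minimal sketch of the two alternating steps (nearest-center assignment, then weighted-mean update); the library's tie-breaking, empty-cluster handling, and early stopping are not shown:

import numpy as np

def weighted_kmeans_sketch(point, weights, init_cluster_center, ncluster, it_max):
    centers = init_cluster_center.copy()
    w = weights.ravel()
    for _ in range(it_max):
        # Assignment: squared Euclidean distance to each center (n x ncluster).
        d2 = ((point[:, :, None] - centers[:, None, :]) ** 2).sum(axis=0)
        label = np.argmin(d2, axis=1)
        # Update: each center becomes the weighted mean of its points.
        for k in range(ncluster):
            mask = label == k
            if mask.any():
                centers[:, k] = (point[:, mask] * w[mask]).sum(axis=1) / w[mask].sum()
    return label, centers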

plot_gmm(gmm, X, label=False, ax=None)#
solve_chol(L, b)#

inv(A)*b where A = L*L^T with L lower triangular
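
Equivalently, two triangular solves: first L y = b, then L^T x = y:

from scipy.linalg import solve_triangular

def solve_chol_sketch(L, b):
    y = solve_triangular(L, b, lower=True)        # forward substitution
    return solve_triangular(L.T, y, lower=False)  # back substitution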