pyroomacoustics.recognition module

class pyroomacoustics.recognition.CircularGaussianEmission(nstates, odim=1, examples=None)
get_pdfs()

Return the pdfs of all the emission distributions

prob_x_given_state(examples)

Recompute the probability of the observation given the state of the latent variables

update_parameters(examples, gamma)

class pyroomacoustics.recognition.GaussianEmission(nstates, odim=1, examples=None)
get_pdfs()

Return the pdfs of all the emission distributions

prob_x_given_state(examples)

Recompute the probability of the observation given the state of the latent variables

update_parameters(examples, gamma)
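
Both emission classes share the same interface; the following is a minimal usage sketch with arbitrary sizes and random placeholder data, assuming that prob_x_given_state returns the per-state emission probabilities for each example sequence:

    import numpy as np

    from pyroomacoustics.recognition import CircularGaussianEmission

    # Training data: each example is an (n_frames, odim) array of feature vectors
    examples = [np.random.randn(100, 6), np.random.randn(80, 6)]

    # Emission model with 4 latent states over 6-dimensional observations
    emission = CircularGaussianEmission(nstates=4, odim=6, examples=examples)

    # Recompute the emission probabilities of the observations under each state
    # (assumed here to come back as one array per example sequence)
    p_x_given_z = emission.prob_x_given_state(examples)
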
class pyroomacoustics.recognition.HMM(nstates, emission, model='full', leftright_jump_max=3)

Hidden Markov Model with Gaussian emissions

K

int – Number of states in the model

O

int – Number of dimensions of the Gaussian emission distribution

A

ndarray – KxK transition matrix of the Markov chain

pi

ndarray – K dim vector of the initial probabilities of the Markov chain

emission

(GaussianEmission or CircularGaussianEmission) – An instance of one of the emission classes above

model

string, optional – The model used for the chain, either ‘full’ or ‘left-right’

leftright_jump_max

int, optional – The number of non-zero upper diagonals in a ‘left-right’ model
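
As a sketch of how these pieces fit together (the sizes, the model type, and the jump width are illustrative choices, and it is assumed that A and pi are initialized at construction, as the attribute list above suggests):

    import numpy as np

    from pyroomacoustics.recognition import HMM, GaussianEmission

    K, O = 4, 6   # number of states and emission dimension

    # Feature sequences used to initialize the emission parameters
    examples = [np.random.randn(100, O), np.random.randn(80, O)]

    emission = GaussianEmission(K, odim=O, examples=examples)

    # In a 'left-right' model only leftright_jump_max upper diagonals of A
    # are non-zero; 'full' places no constraint on the transition matrix
    hmm = HMM(K, emission, model='left-right', leftright_jump_max=2)

    print(hmm.A.shape)   # (K, K) transition matrix
    print(hmm.pi.shape)  # (K,) initial state distribution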

backward(X, p_x_given_z, c)

The backward recursion for HMM as described in Bishop Ch. 13
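
For reference, the scaled backward recursion from Bishop Ch. 13 that this presumably implements is

    \hat{\beta}(z_n) = \frac{1}{c_{n+1}} \sum_{z_{n+1}} \hat{\beta}(z_{n+1})\, p(x_{n+1} \mid z_{n+1})\, p(z_{n+1} \mid z_n)

where the argument c is assumed to hold the scaling factors c_n = p(x_n | x_1, ..., x_{n-1}) computed by the forward pass.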

fit(examples, tol=0.1, max_iter=10, verbose=False)

Training of the HMM using the EM algorithm

Parameters:
  • examples (list) – A list of examples used to train the model. Each example is an array of feature vectors; each row is a feature vector and the sequence runs along axis 0
  • tol (float) – The training stops when the progress between two steps is less than this number (default 0.1)
  • max_iter (int) – Alternatively, the algorithm stops when a maximum number of iterations is reached (default 10)
  • verbose (bool, optional) – When True, prints extra information about convergence
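
A minimal, hedged training sketch with random placeholder data; the tolerance and iteration settings simply echo the defaults listed above:

    import numpy as np

    from pyroomacoustics.recognition import HMM, GaussianEmission

    # Five placeholder training sequences; rows are feature vectors, time runs on axis 0
    examples = [np.random.randn(120, 6) for _ in range(5)]

    emission = GaussianEmission(4, odim=6, examples=examples)
    hmm = HMM(4, emission, model='full')

    # EM stops when the improvement falls below tol or after max_iter iterations
    hmm.fit(examples, tol=0.1, max_iter=10, verbose=True)
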
forward(X, p_x_given_z)

The forward recursion for HMM as described in Bishop Ch. 13
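
For reference, the scaled forward recursion from Bishop Ch. 13 is

    \hat{\alpha}(z_n) = \frac{1}{c_n}\, p(x_n \mid z_n) \sum_{z_{n-1}} \hat{\alpha}(z_{n-1})\, p(z_n \mid z_{n-1}), \qquad c_n = p(x_n \mid x_1, \ldots, x_{n-1})

and the log-likelihood of a sequence is \sum_n \log c_n, which is presumably how loglikelihood() below is obtained from the forward pass alone.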

generate(N)

Generate a random sample of length N using the model

loglikelihood(X)

Compute the log-likelihood of a sample X using the sum-product algorithm
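
A hedged sketch of sampling from and scoring with a trained model, continuing the training example under fit() above; apart from the documented signatures, the setup is a placeholder:

    import numpy as np

    from pyroomacoustics.recognition import HMM, GaussianEmission

    examples = [np.random.randn(120, 6) for _ in range(5)]
    hmm = HMM(4, GaussianEmission(4, odim=6, examples=examples))
    hmm.fit(examples)

    # Draw a synthetic sequence of N = 50 feature vectors from the trained chain
    X = hmm.generate(50)

    # Score the sequence with the sum-product (forward) recursion
    print(hmm.loglikelihood(X))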

update_parameters(examples, gamma, xhi)

Update the parameters of the Markov chain

viterbi()