pyroomacoustics.recognition module

class pyroomacoustics.recognition.CircularGaussianEmission(nstates, odim=1, examples=None)

Bases: object

get_pdfs()

Return the pdfs of all the emission distributions

prob_x_given_state(examples)

Recompute the probability of the observation given the state of the latent variables

update_parameters(examples, gamma)

class pyroomacoustics.recognition.GaussianEmission(nstates, odim=1, examples=None)

Bases: object

get_pdfs()

Return the pdfs of all the emission distributions

prob_x_given_state(examples)

Recompute the probability of the observation given the state of the latent variables

update_parameters(examples, gamma)
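
Both emission classes expose the same interface. A minimal usage sketch, assuming synthetic data and illustrative sizes (3 states, 12-dimensional features) that are not part of the API:

    import numpy as np
    from pyroomacoustics.recognition import GaussianEmission

    K, O = 3, 12  # illustrative: 3 hidden states, 12-dimensional feature vectors

    # A toy training set: two sequences of feature vectors, one row per frame
    examples = [np.random.randn(100, O), np.random.randn(80, O)]

    # Initialize the per-state emission distributions from the examples
    emission = GaussianEmission(nstates=K, odim=O, examples=examples)

    # Per-state densities and the emission probabilities of each example
    pdfs = emission.get_pdfs()
    p_x_given_z = emission.prob_x_given_state(examples)
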

class pyroomacoustics.recognition.HMM(nstates, emission, model='full', leftright_jump_max=3)

Bases: object

Hidden Markov Model with Gaussian emissions

K

Number of states in the model

Type:

int

O

Number of dimensions of the Gaussian emission distribution

Type:

int

A

KxK transition matrix of the Markov chain

Type:

ndarray

pi

K-dimensional vector of the initial state probabilities of the Markov chain

Type:

ndarray

emission

An instance of emission_class

Type:

(GaussianEmission or CircularGaussianEmission)

model

The model used for the chain, either ‘full’ or ‘left-right’

Type:

string, optional

leftright_jump_max

The number of non-zero upper diagonals in a ‘left-right’ model

Type:

int, optional
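
A minimal construction sketch tying these attributes to the constructor arguments. The sizes and the ‘left-right’ choice are arbitrary, and it is assumed that the attributes are populated at construction time and that the emission object can be created without training examples (its examples argument defaults to None).

    from pyroomacoustics.recognition import HMM, GaussianEmission

    K, O = 4, 12  # illustrative number of states and feature dimension
    emission = GaussianEmission(nstates=K, odim=O)

    # A 'left-right' chain allowed to jump at most 2 states forward per transition
    hmm = HMM(K, emission, model='left-right', leftright_jump_max=2)

    print(hmm.K, hmm.O)    # number of states and emission dimension
    print(hmm.A.shape)     # (K, K) transition matrix
    print(hmm.pi.shape)    # (K,) vector of initial state probabilities
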

backward(X, p_x_given_z, c)

The backward recursion for the HMM, as described in Bishop, Ch. 13
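
For reference, a sketch of the textbook recursion in Bishop's scaled formulation, where \hat{\beta}(z_n) denotes the rescaled backward message and c_n the scaling factors computed during the forward pass:

    c_{n+1} \, \hat{\beta}(z_n) = \sum_{z_{n+1}} \hat{\beta}(z_{n+1}) \, p(x_{n+1} \mid z_{n+1}) \, p(z_{n+1} \mid z_n)
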

fit(examples, tol=0.1, max_iter=10, verbose=False)

Train the HMM using the EM algorithm

Parameters:
  • examples (list) – A list of examples used to train the model. Each example is an array of feature vectors; each row is one feature vector and the sequence runs along axis 0

  • tol (float) – The training stops when the progress between two consecutive steps is less than this number (default 0.1)

  • max_iter (int) – Alternatively, the algorithm stops when a maximum number of iterations is reached (default 10)

  • verbose (bool, optional) – When True, prints extra information about convergence
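
A minimal training sketch for fit(); the model size and the synthetic training sequences below are illustrative assumptions.

    import numpy as np
    from pyroomacoustics.recognition import HMM, GaussianEmission

    K, O = 4, 12  # illustrative number of states and feature dimension

    # Ten synthetic training sequences of varying length, one feature vector per row
    examples = [np.random.randn(np.random.randint(60, 120), O) for _ in range(10)]

    emission = GaussianEmission(nstates=K, odim=O, examples=examples)
    hmm = HMM(K, emission, model='left-right')

    # EM training: stops when the improvement falls below tol or after max_iter iterations
    hmm.fit(examples, tol=0.01, max_iter=50, verbose=True)
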

forward(X, p_x_given_z)

The forward recursion for the HMM, as described in Bishop, Ch. 13
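
For reference, the textbook forward recursion in Bishop's scaled formulation, with normalized messages \hat{\alpha}(z_n) = p(z_n \mid x_1, \ldots, x_n) and scaling factors c_n = p(x_n \mid x_1, \ldots, x_{n-1}):

    c_n \, \hat{\alpha}(z_n) = p(x_n \mid z_n) \sum_{z_{n-1}} \hat{\alpha}(z_{n-1}) \, p(z_n \mid z_{n-1})
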

generate(N)

Generate a random sample of length N using the model
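
A sampling sketch, assuming the returned sample follows the same convention as the training examples (one feature vector per row):

    import numpy as np
    from pyroomacoustics.recognition import HMM, GaussianEmission

    K, O = 4, 12  # illustrative model size
    examples = [np.random.randn(100, O) for _ in range(5)]
    emission = GaussianEmission(nstates=K, odim=O, examples=examples)
    hmm = HMM(K, emission)
    hmm.fit(examples, max_iter=5)

    # Draw a random sequence of 200 feature vectors from the fitted model
    X = hmm.generate(200)
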

loglikelihood(X)

Compute the log-likelihood of a sample vector using the sum-product algorithm
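
Assuming the scaled forward recursion above, the log-likelihood follows directly from the scaling factors:

    \ln p(x_1, \ldots, x_N) = \sum_{n=1}^{N} \ln c_n
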

update_parameters(examples, gamma, xhi)

Update the parameters of the Markov chain

viterbi()
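
By its name, this presumably implements the standard Viterbi (max-sum) decoding of the most probable state sequence; for reference, the textbook recursion (Bishop, Ch. 13) is

    \omega(z_1) = \ln p(z_1) + \ln p(x_1 \mid z_1)

    \omega(z_{n+1}) = \ln p(x_{n+1} \mid z_{n+1}) + \max_{z_n} \{ \ln p(z_{n+1} \mid z_n) + \omega(z_n) \}
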