This function initialises a general discrete-time and discrete-space Hidden Markov Model (HMM). An HMM consists of an alphabet of states and an alphabet of emission symbols. It assumes that the states are hidden from the observer, while only the emissions of the states are observable. The HMM is designed to make inferences about the states through the observation of emissions. The stochastic behaviour of the HMM is fully described by the initial starting probabilities of the states, the transition probabilities between states and the emission probabilities of the states.
initHMM(States, Symbols, startProbs=NULL, transProbs=NULL, emissionProbs=NULL)
States
Vector with the names of the states.
Symbols
Vector with the names of the symbols.
startProbs
Vector with the starting probabilities of the states.
transProbs
Stochastic matrix containing the transition probabilities between the states.
emissionProbs
Stochastic matrix containing the emission probabilities of the states.
The function initHMM returns an HMM that consists of a list of 5 elements:
States
Vector with the names of the states.
Symbols
Vector with the names of the symbols.
startProbs
Annotated vector with the starting probabilities of the states.
transProbs
Annotated matrix containing the transition probabilities between the states.
emissionProbs
Annotated matrix containing the emission probabilities of the states.
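For orientation, a minimal sketch of inspecting the returned list (the default initialisation shown here is just an illustration):
hmm <- initHMM(c("X","Y"), c("a","b","c"))
str(hmm)   # shows the five elements: States, Symbols, startProbs, transProbs, emissionProbs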
Dimension and Format of the Arguments.
States
Vector of strings.
Symbols
Vector of strings.
startProbs
Vector with the starting probabilities of the states. The entries must sum to 1.
transProbs is a (number of states) x (number of states)-sized matrix, which contains the transition probabilities between states. The entry transProbs[X,Y] gives the probability of a transition from state X to state Y. The rows of the matrix must sum to 1. A small sketch of a matrix in this format follows below.
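As a small sketch (the state names "X" and "Y" are placeholders), a transition matrix in this format can be built with dimnames, so that rows index the current state and columns the next state:
transProbs <- matrix(c(.9, .1,
                       .1, .9), nrow=2, byrow=TRUE,
                     dimnames=list(from=c("X","Y"), to=c("X","Y")))
transProbs["X","Y"]   # probability of a transition from state X to state Y (0.1)
rowSums(transProbs)   # each row must sum to 1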
emissionProbs is a (number of states) x (number of symbols)-sized matrix, which contains the emission probabilities of the states. The entry emissionProbs[X,e] gives the probability that state X emits symbol e. The rows of the matrix must sum to 1. A corresponding sketch follows below.
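Analogously, a sketch of an emission matrix for two states and three symbols (the probability values are chosen arbitrarily for illustration):
emissionProbs <- matrix(c(.6, .3, .1,
                          .1, .3, .6), nrow=2, byrow=TRUE,
                        dimnames=list(states=c("X","Y"), symbols=c("a","b","c")))
emissionProbs["X","c"]   # probability that state X emits symbol "c" (0.1)
rowSums(emissionProbs)   # each row must sum to 1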
In transProbs and emissionProbs, NAs can be used to forbid specific transitions and emissions. This can be useful for Viterbi training or for the Baum-Welch algorithm when using pseudocounts.
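A hedged sketch of forbidding a single transition with an NA entry (whether this is appropriate depends on the training algorithm, as noted above):
transForbidden <- matrix(c(.9, .1,
                            NA,  1), nrow=2, byrow=TRUE)
initHMM(c("X","Y"), c("a","b"), transProbs=transForbidden)
# The NA entry marks the transition from the second state back to the first as forbidden.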
For an introduction to the HMM literature see, for example:
Lawrence R. Rabiner: A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE 77(2), p. 257-286, 1989.
Olivier Cappé, Eric Moulines, Tobias Rydén: Inference in Hidden Markov Models. Springer. ISBN 0-387-40264-0.
Yariv Ephraim, Neri Merhav: Hidden Markov Processes. IEEE Trans. Inform. Theory 48, p. 1518-1569, 2002.
See simHMM to simulate a path of states and observations from a Hidden Markov Model.
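A brief sketch of combining the two functions (assuming simHMM(hmm, length) from the same package; the sequence length 10 is arbitrary):
hmm <- initHMM(c("X","Y"), c("a","b"), c(.3,.7), matrix(c(.9,.1,.1,.9),2),
               matrix(c(.3,.7,.7,.3),2))
sim <- simHMM(hmm, 10)   # simulate 10 steps of hidden states and their emissions
sim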
# Initialise HMM nr.1
initHMM(c("X","Y"), c("a","b","c"))
# Initialise HMM nr.2
initHMM(c("X","Y"), c("a","b"), c(.3,.7), matrix(c(.9,.1,.1,.9),2),
matrix(c(.3,.7,.7,.3),2))