# Introduction to Hidden Markov Models

## We present Markov Chains and the Hidden Markov Model.

## Markov Chains

Let us first give a brief introduction to Markov Chains, a type of random process. We begin with a few “states” for the chain, {S₁,…,Sₖ}; for instance, if our chain represents the daily weather, we can have {Snow, Rain, Sunshine}. The property a process (Xₜ)ₜ should have to be a Markov Chain is:

P(Xₜ₊₁ = Sⱼ | Xₜ = Sᵢ, Xₜ₋₁, …, X₀) = P(Xₜ₊₁ = Sⱼ | Xₜ = Sᵢ)

In words, the probability of being in state j at the next step depends only on the current state, and not on what happened before.
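This property can be seen directly in a simulation: each step is drawn using only the current state's transition probabilities. The sketch below uses a hypothetical three-state weather chain whose numbers are illustrative assumptions, not values from the article.

```python
import random

# Hypothetical weather chain; these transition probabilities are
# illustrative assumptions, not taken from the article.
states = ["Snow", "Rain", "Sunshine"]
P = {
    "Snow":     [0.3, 0.3, 0.4],
    "Rain":     [0.1, 0.45, 0.45],
    "Sunshine": [0.2, 0.3, 0.5],
}

def step(current):
    # The next state is drawn using only the current state's row --
    # exactly the Markov property: no earlier history is consulted.
    return random.choices(states, weights=P[current])[0]

random.seed(0)
chain = ["Sunshine"]
for _ in range(10):
    chain.append(step(chain[-1]))
print(chain)
```

Note that `step` never looks at `chain[:-1]`; the entire history is irrelevant once the current state is known.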

Markov Chains are often described by a graph with transition probabilities, i.e., the probability of moving to state j from state i, denoted by pᵢ,ⱼ. Let’s look at the following example:

The chain has three states; for instance, the transition probability between Snow and Rain is 0.3, that is: if it was snowing yesterday, there is a 30% chance it will rain today. The transition probabilities can be summarized in a matrix:

Notice that the sum of each row equals 1 (think why: from any state, the chain must move to *some* state). Such a matrix is called a Stochastic Matrix. The (i,j) entry is defined as pᵢ,ⱼ, the transition probability from i to j.
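As a quick sanity check, the row-sum property is easy to verify in code. Only the Snow→Rain entry (0.3) comes from the text; the remaining entries below are assumed for illustration.

```python
import numpy as np

# Row-stochastic matrix for the weather chain. Only the Snow->Rain
# entry (0.3) is from the text; the rest are illustrative assumptions.
states = ["Snow", "Rain", "Sunshine"]
P = np.array([
    [0.3, 0.3, 0.4],    # from Snow
    [0.1, 0.45, 0.45],  # from Rain
    [0.2, 0.3, 0.5],    # from Sunshine
])

# Each row is a probability distribution over next states,
# so every row must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```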

Fact: if we take the k-th power of the matrix, Pᵏ, the (i,j) entry is the probability of arriving at state j from state i in k steps.
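This fact follows from conditioning on the intermediate states, and can be checked numerically for k = 2: the (i,j) entry of P² must equal the sum over all intermediate states of the two individual transition probabilities. The matrix below is the same illustrative one as above (only Snow→Rain = 0.3 is from the text).

```python
import numpy as np

# Illustrative matrix (only the Snow->Rain entry, 0.3, is from the text).
P = np.array([
    [0.3, 0.3, 0.4],    # from Snow
    [0.1, 0.45, 0.45],  # from Rain
    [0.2, 0.3, 0.5],    # from Sunshine
])

P2 = np.linalg.matrix_power(P, 2)

# Two-step Snow -> Rain probability, by summing over the
# intermediate state k: sum_k p(Snow,k) * p(k,Rain).
manual = sum(P[0, k] * P[k, 1] for k in range(3))
assert np.isclose(P2[0, 1], manual)
print(P2[0, 1])  # 0.345 for this particular matrix
```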

In many cases we are given a vector of initial probabilities q = (q₁,…,qₖ) of being at each state at time t = 0. Since pᵢ,ⱼ takes us *from* i *to* j, the probability of being at state i at time t is then the i-th entry of the row vector qPᵗ.

For instance, if today the probabilities of snow, rain and sunshine are 0, 0.2 and 0.8 respectively, then the probability it will rain 100 days from now is the Rain entry of the vector qP¹⁰⁰.
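The computation qP¹⁰⁰ can be sketched as follows. The initial vector (0, 0.2, 0.8) is from the text; the transition matrix is the same illustrative one used above (only Snow→Rain = 0.3 is stated in the article), so the resulting number is illustrative too.

```python
import numpy as np

states = ["Snow", "Rain", "Sunshine"]
# Illustrative transition matrix (only Snow->Rain = 0.3 is from the text).
P = np.array([
    [0.3, 0.3, 0.4],
    [0.1, 0.45, 0.45],
    [0.2, 0.3, 0.5],
])
q = np.array([0.0, 0.2, 0.8])  # today's distribution, from the text

# Distribution over the weather 100 days from now: q P^100.
dist = q @ np.linalg.matrix_power(P, 100)
print(dist[states.index("Rain")])
```

For large powers the rows of Pᵏ converge (for a chain like this one) to a stationary distribution, so the answer barely depends on q.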