Today I Learned

Some of the things I've learned every day since Oct 10, 2016

41: Reducible/Irreducible Markov Chains

A Markov chain is irreducible if every state of the chain is accessible from every other state. That is, no matter the chain's current state, it can always reach any given state in some finite number of transitions.

Conversely, a Markov chain is reducible if this is not the case: there is at least one pair of states $(S, T)$ for which it is impossible to ever transition from $S$ to $T$, no matter how many steps are taken.
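This property only depends on which transitions have nonzero probability, so it can be checked by treating the transition matrix as a directed graph and testing whether every state can reach every other. A minimal sketch (the matrices `P1` and `P2` are made-up examples, not from any particular source):

```python
from itertools import product

def is_irreducible(P):
    """Return True if the Markov chain with transition matrix P is irreducible,
    i.e. every state is reachable from every other state."""
    n = len(P)
    # reach[i][j]: state j is reachable from state i in exactly one step
    reach = [[P[i][j] > 0 for j in range(n)] for i in range(n)]
    # Floyd-Warshall-style transitive closure over intermediate states k
    for k, i, j in product(range(n), repeat=3):
        reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return all(all(row) for row in reach)

# Irreducible: both states can reach each other
P1 = [[0.5, 0.5],
      [0.3, 0.7]]

# Reducible: state 1 is absorbing, so state 0 is unreachable from it
P2 = [[0.5, 0.5],
      [0.0, 1.0]]

print(is_irreducible(P1))  # True
print(is_irreducible(P2))  # False
```

In `P2`, the pair $(S, T)$ from the definition is state 1 and state 0: once the chain enters state 1, it can never leave.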