Some of the things I've learned every day since Oct 10, 2016
41: Reducible/Irreducible Markov Chains
November 20, 2016
A Markov chain is irreducible if every state is accessible from every other state. That is, no matter which state the chain currently occupies, there is a positive probability that it will eventually reach any other state.
Conversely, a Markov chain is reducible if this is not the case: there is at least one pair of states i and j such that j can never be reached from i, no matter how many steps the chain takes.
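One way to make this concrete: a chain is irreducible exactly when the directed graph of its positive-probability transitions is strongly connected. Below is a small sketch of that check in Python with numpy; `is_irreducible` is a helper name introduced here, and it uses the standard fact that for an n-state graph with 0/1 adjacency matrix A, every entry of (I + A)^(n-1) is positive iff the graph is strongly connected.

```python
import numpy as np

def is_irreducible(P):
    """Check whether the Markov chain with transition matrix P is
    irreducible, i.e. every state is reachable from every other state.

    Builds the adjacency matrix of positive-probability transitions and
    tests strong connectivity via powers of (I + A)."""
    P = np.asarray(P)
    n = P.shape[0]
    A = (P > 0).astype(int)  # edge i -> j whenever P[i, j] > 0
    # (I + A)^(n-1) counts walks of length <= n-1; an entry is
    # positive iff j is reachable from i.
    R = np.linalg.matrix_power(np.eye(n, dtype=int) + A, n - 1)
    return bool((R > 0).all())

# Irreducible: each state can reach the other directly.
P_irr = [[0.5, 0.5],
         [0.3, 0.7]]

# Reducible: state 1 is absorbing, so state 0 is unreachable from it.
P_red = [[0.5, 0.5],
         [0.0, 1.0]]

print(is_irreducible(P_irr))  # True
print(is_irreducible(P_red))  # False
```

The absorbing-state example is the classic reducible case: once the chain enters state 1 it stays there forever, so the pair (1, 0) witnesses reducibility.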