Today I Learned

Some of the things I've learned every day since Oct 10, 2016

41: Reducible/Irreducible Markov Chains

A Markov chain is irreducible if every state of the chain is accessible from every other state. That is, regardless of the chain’s current state, it can eventually transition to any given state with positive probability.

Conversely, a Markov chain is reducible if this is not the case: there is at least one pair of states (S, T) for which it is impossible, in any number of steps, to transition from S to T.
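One way to check this property computationally is to treat the positive entries of the transition matrix as edges of a directed graph and verify that every state can reach every other state. A minimal sketch (the function names here are my own, not from any particular library):

```python
def is_irreducible(P):
    """Return True if the Markov chain with transition matrix P is
    irreducible, i.e. every state is reachable from every other state."""
    n = len(P)
    # Edge i -> j exists whenever the one-step transition probability is positive.
    edges = [{j for j in range(n) if P[i][j] > 0} for i in range(n)]

    def reachable_from(s):
        # Depth-first search over the transition graph starting at state s.
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in edges[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    return all(reachable_from(s) == set(range(n)) for s in range(n))


# Irreducible: both states can always reach each other.
P_irr = [[0.5, 0.5],
         [0.5, 0.5]]

# Reducible: once the chain enters state 1 it stays there forever,
# so state 0 is unreachable from state 1 -- the pair (S, T) = (1, 0).
P_red = [[0.5, 0.5],
         [0.0, 1.0]]

print(is_irreducible(P_irr))  # True
print(is_irreducible(P_red))  # False
```

The second matrix illustrates the definition above: state 1 is absorbing, so the pair (1, 0) violates accessibility and the chain is reducible.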
