Some of the things I've learned every day since Oct 10, 2016
46: Absorbing States of Markov Chains
November 25, 2016
A state in a Markov chain is absorbing iff it is impossible for the chain to leave the state once it's there — equivalently, its self-transition probability is 1. If an absorbing state is accessible from every state in the chain, then the chain itself is likewise said to be absorbing.
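As a rough sketch of these two definitions (the transition matrix `P` and the helper names here are my own, not from the post): an absorbing state is a row with a 1 on the diagonal, and the chain is absorbing when a breadth-first search along positive-probability transitions from every state reaches some absorbing state.

```python
def absorbing_states(P):
    """Indices i with P[i][i] == 1: once entered, never left."""
    return [i for i in range(len(P)) if P[i][i] == 1.0]

def is_absorbing_chain(P):
    """True iff every state can reach at least one absorbing state."""
    n = len(P)
    absorbing = set(absorbing_states(P))
    if not absorbing:
        return False
    for start in range(n):
        # BFS along edges with positive transition probability
        seen = {start}
        frontier = [start]
        while frontier:
            nxt = []
            for s in frontier:
                for t in range(n):
                    if P[s][t] > 0 and t not in seen:
                        seen.add(t)
                        nxt.append(t)
            frontier = nxt
        if not (seen & absorbing):
            return False
    return True

# Gambler's-ruin-style chain: states 0 and 2 are absorbing,
# state 1 moves to either neighbor with probability 1/2.
P = [[1.0, 0.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0]]
```

Here `absorbing_states(P)` returns `[0, 2]` and `is_absorbing_chain(P)` is `True`; a two-state chain that just cycles `0 → 1 → 0` has no absorbing state, so the chain check fails immediately.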