# Today I Learned

Some of the things I've learned every day since Oct 10, 2016

## 45: Transient and Recurrent States of Markov Chains

November 24, 2016

A state *i* of a Markov chain is **transient** if, given we start from *i*, there is a non-zero probability that we will never return to *i*. A state is transient iff the expected number of visits to that state is finite.

Conversely, *i* is **recurrent** if, given we start from *i*, the probability that we will eventually return to *i* is 1. A state is recurrent iff the expected number of visits to that state is infinite.
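As a quick sketch (not from the original post), a Monte Carlo simulation of a hypothetical three-state chain illustrates the transient case: states 0 and 1 can only drift toward the absorbing state 2 (which is recurrent), so the expected number of visits to state 0 is finite. Here the return probability from state 0 is 0.5, so the expected visit count is 1 / (1 - 0.5) = 2.

```python
import random

# Hypothetical 3-state chain: states 0 and 1 are transient,
# state 2 is absorbing (and hence recurrent).
# Each entry maps a state to its (next_state, probability) pairs.
P = {
    0: [(0, 0.5), (1, 0.5)],
    1: [(1, 0.5), (2, 0.5)],
    2: [(2, 1.0)],
}

def step(state):
    """Sample the next state from the transition distribution."""
    r, cum = random.random(), 0.0
    for nxt, p in P[state]:
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def visits_to(start, target, max_steps=10_000):
    """Count visits to `target` along one trajectory from `start`."""
    state, count = start, 0
    for _ in range(max_steps):
        if state == target:
            count += 1
        if state == 2:  # absorbed: no further visits to 0 or 1 possible
            break
        state = step(state)
    return count

random.seed(0)
trials = 20_000
avg = sum(visits_to(0, 0) for _ in range(trials)) / trials
print(f"average visits to state 0: {avg:.2f}")  # close to 2 = 1/(1 - 0.5)
```

The visit count to a transient state is geometric (each return happens with probability less than 1), which is why its expectation is finite; for a recurrent state the simulation would never stop revisiting it.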

