# Today I Learned

Some of the things I've learned every day since Oct 10, 2016

## 45: Transient and Recurrent States of Markov Chains

A state $S$ of a Markov chain $X$ is transient if, starting from $S$, there is a non-zero probability that we never return to $S$. Equivalently, $S$ is transient iff the expected number of visits to $S$, starting from $S$, is finite.

Conversely, $S$ is recurrent if, starting from $S$, the probability of eventually returning to $S$ is 1. Equivalently, $S$ is recurrent iff the expected number of visits to $S$, starting from $S$, is infinite. (A classic example: the simple symmetric random walk on $\mathbb{Z}^d$ is recurrent for $d \le 2$ and transient for $d \ge 3$.)
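
The transient case can be checked numerically. Below is a minimal sketch (not from the original note) using a hypothetical two-state chain: from state 0 we stay at 0 with probability 0.5 or move to an absorbing state 1 with probability 0.5. The return probability to 0 is 0.5 < 1, so state 0 is transient, and the number of visits to 0 (counting the start) is geometric with mean 2 — finite, as the characterization above predicts.

```python
import random

def expected_visits_to_transient_state(n_trials=100_000, seed=0):
    """Monte Carlo estimate of the expected number of visits to state 0.

    Chain: from state 0, stay at 0 with prob 0.5, else move to the
    absorbing state 1. The visit count to 0 (including the starting
    visit) is geometric with mean 1 / (1 - 0.5) = 2.
    """
    rng = random.Random(seed)
    total_visits = 0
    for _ in range(n_trials):
        visits = 1                    # we begin in state 0
        while rng.random() < 0.5:     # with prob 0.5, return to 0
            visits += 1
        total_visits += visits        # absorbed into state 1; stop
    return total_visits / n_trials

print(expected_visits_to_transient_state())
```

The estimate should hover near 2, confirming that the expected visit count to a transient state is finite; for a recurrent state the analogous simulation would never terminate in expectation.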