Today I Learned

Some of the things I've learned every day since Oct 10, 2016

45: Transient and Recurrent States of Markov Chains

A state S of a Markov chain X is transient if, starting from S, there is a non-zero probability that the chain never returns to S. Equivalently, a state is transient iff the expected number of visits to that state is finite.

Conversely, S is recurrent if, starting from S, the probability that the chain eventually returns to S is 1. Equivalently, a state is recurrent iff the expected number of visits to that state is infinite.
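The "expected number of visits" characterization can be checked numerically. Here's a small sketch (the chain and NumPy are my own choices, not from the definition above): for a chain whose transient states have sub-transition matrix Q, the fundamental matrix N = (I − Q)⁻¹ gives the expected number of visits N[i, j] to transient state j starting from transient state i. These expectations are finite precisely because the states are transient.

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 are transient,
# state 2 is absorbing (and therefore recurrent).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],
])

# Q: transitions among the transient states only.
Q = P[:2, :2]

# Fundamental matrix N = (I - Q)^{-1}.
# N[i, j] = expected number of visits to transient state j,
# starting from transient state i (counting the start as a visit).
N = np.linalg.inv(np.eye(2) - Q)
print(N)
```

Every entry of N is finite, matching the characterization of transience; for the absorbing state 2, once entered the chain returns with probability 1 on every step, so its expected number of visits is infinite.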
