# Today I Learned

Some of the things I've learned every day since Oct 10, 2016

## Category Archives: probability

## 91: Categorical Distribution (Probability)

January 14, 2017

A **categorical distribution** is one which describes a random variable which can take one of $k$ possible values, the probability of each possible outcome being known. For instance, the Bernoulli distribution is a special case of the categorical distribution, with $k = 2$ (one outcome being ‘success’, the other ‘failure’). The distribution of the outcomes of rolling a (not necessarily fair) 6-sided die would also be an example, with $k = 6$.
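The biased-die example can be sketched with the standard library; the particular probabilities below are made up for illustration:

```python
import random
from collections import Counter

# Categorical distribution on k = 6 outcomes: a biased die.
# The probability vector is illustrative, not from the post.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [0.10, 0.10, 0.10, 0.10, 0.10, 0.50]

random.seed(0)
draws = random.choices(outcomes, weights=probs, k=100_000)
counts = Counter(draws)

# Empirical frequencies should be close to the specified probabilities.
for face in outcomes:
    print(face, round(counts[face] / len(draws), 3))
```

With 100,000 draws the empirical frequency of each face lands close to its assigned probability (e.g. face 6 near 0.5).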

## 90: The Standard (Unit) Simplex

January 11, 2017

The **standard $n$-simplex**, sometimes denoted $\Delta^n$, is the $n$-simplex in $\mathbb{R}^{n+1}$ with its vertices at the ‘unit’ points

$e_1 = (1, 0, \dots, 0),\ e_2 = (0, 1, 0, \dots, 0),\ \dots,\ e_{n+1} = (0, \dots, 0, 1).$

For example, $\Delta^0$ is the point $1$ in $\mathbb{R}^1$, $\Delta^1$ is the line segment from $(1, 0)$ to $(0, 1)$ in $\mathbb{R}^2$, and so on. A formal description is

$\Delta^n = \left\{ (x_1, \dots, x_{n+1}) \in \mathbb{R}^{n+1} : \sum_{i=1}^{n+1} x_i = 1,\ x_i \ge 0 \text{ for all } i \right\}.$

One possible interpretation of $\Delta^{k-1}$ is as the set of possible probability distributions of a categorical distribution on $k$ values.
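The formal description translates directly into a membership test; `in_standard_simplex` is a name of my own, and the tolerance handles floating-point noise:

```python
def in_standard_simplex(x, tol=1e-9):
    """Check the defining conditions: nonnegative entries summing to 1."""
    return all(xi >= -tol for xi in x) and abs(sum(x) - 1.0) <= tol

# A categorical distribution on 3 values is a point of the 2-simplex:
print(in_standard_simplex([0.2, 0.3, 0.5]))   # → True
# Sums to 1 but has a negative entry, so it is not in the simplex:
print(in_standard_simplex([0.7, 0.7, -0.4]))  # → False
```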

## 84: Some Properties of Transition Matrices (Probability)

January 4, 2017

The following are some miscellaneous related properties of transition matrices describing stochastic processes such as Markov chains.

A ‘transition matrix’ is a square matrix $P$ with real entries in $[0, 1]$ such that the sum of the entries of any given column of $P$ is equal to 1, the interpretation being that $P_{ij}$ is the probability of a stochastic process transitioning from state $j$ to state $i$. A ‘regular’ matrix is a square matrix $P$ such that $P^n$ contains strictly positive entries for some $n \ge 1$.

### For any transition matrix:

- Every transition matrix has $1$ as an eigenvalue.
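One way to see this property: with the column convention, the all-ones vector $u$ satisfies $u^\top P = u^\top$, so $1$ is an eigenvalue of $P^\top$ and hence of $P$. A quick numerical check (the matrix entries are illustrative, chosen as exact binary fractions so the sums are exact):

```python
# Each column of this illustrative transition matrix sums to 1.
P = [[0.25, 0.5,  0.125],
     [0.25, 0.25, 0.375],
     [0.5,  0.25, 0.5]]

# u^T P just computes the column sums, so it returns u^T itself,
# witnessing that 1 is an eigenvalue of P^T (and therefore of P).
u = [1.0, 1.0, 1.0]
uP = [sum(u[i] * P[i][j] for i in range(3)) for j in range(3)]
print(uP)  # → [1.0, 1.0, 1.0]
```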

### For regular transition matrices:

- The algebraic multiplicity of $1$ as an eigenvalue of $P$ is $1$.
- The columns of $\lim_{n \to \infty} P^n$ are identical, all being the unique $P$-invariant probability vector $q$ such that the sum of the entries of $q$ is $1$ (because it’s a probability vector) and $Pq = q$.
- With $q$ defined as above, for any probability vector $x$, $\lim_{n \to \infty} P^n x = q$. That is, the system or process described by $P$ eventually converges to a distribution described by $q$, regardless of its initial state.
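The column-convergence property can be checked numerically; the sketch below uses a small illustrative column-stochastic matrix whose second eigenvalue is $0.4$, so 50 extra multiplications are far more than enough for the columns to agree:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

# Illustrative regular transition matrix (columns sum to 1).
P = [[0.9, 0.5],
     [0.1, 0.5]]

Pn = P
for _ in range(50):          # Pn ends up equal to P^51
    Pn = matmul(Pn, P)

# Both columns have converged to the invariant vector q = (5/6, 1/6).
print([round(Pn[i][0], 4) for i in range(2)])  # → [0.8333, 0.1667]
print([round(Pn[i][1], 4) for i in range(2)])  # → [0.8333, 0.1667]
```

Here $q = (5/6, 1/6)$ can be verified by hand from $Pq = q$: the second row gives $0.1 q_1 + 0.5 q_2 = q_2$, i.e. $q_2 = q_1 / 5$.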

## 66: Regular Transition Matrices

December 17, 2016

The transition matrix $P$ of a Markov chain is called **regular** if some positive power $P^n$ of the matrix contains only positive entries.
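The definition suggests a brute-force check (the helper names and the power cutoff are mine; the example matrices are illustrative and column-stochastic):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def is_regular(P, max_power=100):
    """True if some power P^n (n <= max_power) has all strictly positive entries."""
    Pn = P
    for _ in range(max_power):
        if all(entry > 0 for row in Pn for entry in row):
            return True
        Pn = matmul(Pn, P)
    return False

P_regular = [[0.0, 0.5],
             [1.0, 0.5]]   # P itself has a zero, but P^2 is all-positive
P_identity = [[1.0, 0.0],
              [0.0, 1.0]]  # every power is the identity, never all-positive

print(is_regular(P_regular))   # → True
print(is_regular(P_identity))  # → False
```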

## 60: Closure of MIN on Exponential Random Variables

December 10, 2016

The minimum of exponential random variables $X_1, \dots, X_n$ with corresponding parameters $\lambda_1, \dots, \lambda_n$ is itself an exponential random variable with parameter $\lambda_1 + \dots + \lambda_n$.

It is not, however, the case that the maximum is also exponentially distributed.
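A quick simulation illustrates the closure property; the rates $\lambda_1 = 2$, $\lambda_2 = 3$ are arbitrary choices for the sketch:

```python
import random

random.seed(1)
lam1, lam2 = 2.0, 3.0   # illustrative rate parameters
n = 200_000
mins = [min(random.expovariate(lam1), random.expovariate(lam2))
        for _ in range(n)]

# If min(X1, X2) ~ Exponential(lam1 + lam2) = Exponential(5),
# its mean should be 1 / 5 = 0.2.
print(round(sum(mins) / n, 2))  # → 0.2
```

The empirical mean matches $1/(\lambda_1 + \lambda_2)$, consistent with the minimum being $\text{Exponential}(\lambda_1 + \lambda_2)$.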

## 52: Ergodicity in Markov Chains

December 1, 2016

A state in a Markov chain is **ergodic** iff it is aperiodic and recurrent. If all states in a Markov chain are ergodic, then the chain itself is said to be ergodic. This is equivalent to the condition that there is a number $n$ such that any state in the chain can be reached from any other state in exactly $n$ steps.

## 47: Invariant Distributions as Eigenvectors

November 26, 2016

Since a stationary distribution $\pi$ of a finite Markov chain $M$ satisfies $P\pi = \pi$, where $P$ is the transition matrix of $M$, it can be seen as an eigenvector of eigenvalue $1$ under the linear transformation by $P$. Specifically, $\pi$ lies in the intersection of the eigenspace $E_1$ with the hyperplane formed by the constraint that the entries of $\pi$ sum to $1$.

(Here the vector space in question is $\mathbb{R}^n$, where $n$ is the number of states in $M$.)
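The eigenvector view suggests computing $\pi$ by power iteration: repeatedly applying $P$ to any probability vector kills the components outside $E_1$ (for a regular chain). A minimal sketch, using an illustrative symmetric, doubly stochastic matrix whose stationary distribution is therefore uniform:

```python
# Illustrative column-stochastic matrix; symmetric and doubly stochastic,
# so its stationary distribution is the uniform vector (1/3, 1/3, 1/3).
P = [[0.5,  0.25, 0.25],
     [0.25, 0.5,  0.25],
     [0.25, 0.25, 0.5]]

pi = [1.0, 0.0, 0.0]   # any starting probability vector works
for _ in range(100):   # power iteration: pi <- P pi
    pi = [sum(P[i][j] * pi[j] for j in range(3)) for i in range(3)]

print([round(p, 4) for p in pi])  # → [0.3333, 0.3333, 0.3333]
```

Since $P$ is column-stochastic, each application preserves the sum-to-one constraint, so the iterate stays on the hyperplane while converging into $E_1$.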

## 46: Absorbing States of Markov Chains

November 25, 2016

A state $s$ in a Markov chain $M$ is **absorbing** iff it is impossible for the chain to leave $s$ once it’s there. If an absorbing state is accessible from any arbitrary state in $M$, then $M$ is likewise said to be absorbing.
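With the column convention, state $j$ is absorbing exactly when $P_{jj} = 1$ (all probability mass from $j$ stays at $j$). A small sketch on a gambler's-ruin-style chain with illustrative entries:

```python
def absorbing_states(P):
    """States j with P[j][j] == 1: once entered, never left."""
    return [j for j in range(len(P)) if P[j][j] == 1.0]

# Three states; 0 and 2 are absorbing, 1 moves to either with prob 1/2.
P = [[1.0, 0.5, 0.0],
     [0.0, 0.0, 0.0],
     [0.0, 0.5, 1.0]]

print(absorbing_states(P))  # → [0, 2]
```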

## 45: Transient and Recurrent States of Markov Chains

November 24, 2016

A state $s$ of a Markov chain is **transient** if, given we start from $s$, there is a non-zero probability that we will never return to $s$. A state is transient iff the expected number of visits to that state is finite.

Conversely, $s$ is **recurrent** if, given we start from $s$, the probability we will eventually return to $s$ is 1. A state is recurrent iff the expected number of visits to that state is infinite.
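The finite-expected-visits criterion can be seen in a simulation. In the illustrative two-state chain below, state 0 stays put with probability $1/2$ and otherwise falls into an absorbing state, so the number of visits to 0 is geometric with mean $1 / (1 - 1/2) = 2$, which is finite, so state 0 is transient:

```python
import random

random.seed(2)

def visits_to_zero():
    """Run the chain from state 0; count visits to 0 before absorption."""
    state, visits = 0, 0
    while state == 0:
        visits += 1
        state = 0 if random.random() < 0.5 else 1  # 1 is absorbing
    return visits

n = 100_000
avg = sum(visits_to_zero() for _ in range(n)) / n
print(round(avg, 1))  # → 2.0, the finite expected number of visits
```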

## 43: Aperiodicity of Markov Chains

November 22, 2016

A state $i$ in a Markov chain is **aperiodic** iff there exists an $N$ such that

$\Pr(X_n = i \mid X_0 = i) > 0 \text{ for all } n \ge N.$

That is, a state is aperiodic if the chain can always loop back to this state in an arbitrary (sufficiently large) number of steps.

A Markov chain is aperiodic iff every state in the chain is aperiodic. If the chain is irreducible, then the existence of a single aperiodic state implies that the entire chain is aperiodic.
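Equivalently, a state is aperiodic when the gcd of its possible return times is 1, which is easy to check numerically. The helper names and the step cutoff are mine; the two example matrices are illustrative (a pure 2-cycle, which has period 2, and a chain with a self-loop, which is aperiodic):

```python
from math import gcd

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def period(P, state, max_steps=50):
    """gcd of all n <= max_steps with (P^n)[state][state] > 0; 1 means aperiodic."""
    Pn, g = P, 0
    for n in range(1, max_steps + 1):
        if Pn[state][state] > 0:
            g = gcd(g, n)
        Pn = matmul(Pn, P)
    return g

print(period([[0, 1], [1, 0]], 0))        # 2-cycle → period 2, not aperiodic
print(period([[0.5, 1.0], [0.5, 0.0]], 0))  # self-loop at 0 → period 1, aperiodic
```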
