Some of the things I've learned every day since Oct 10, 2016
Category Archives: probability
January 14, 2017
A categorical distribution is one which describes a random variable which can take one of $k$ possible values, the probability of each possible outcome being known. For instance, the Bernoulli distribution is a special case of the categorical distribution with $k = 2$ (one outcome being ‘success’, the other ‘failure’). The distribution of the outcomes of rolling a (not necessarily fair) 6-sided die would also be an example, with $k = 6$.
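As a quick illustration, here is a sketch in Python (standard library only; the face probabilities are made up for the example) of sampling from a categorical distribution for a biased die:

```python
import random
from collections import Counter

# A categorical distribution over k = 6 outcomes (a biased die),
# with one known probability per face; the probabilities sum to 1.
faces = [1, 2, 3, 4, 5, 6]
probs = [0.25, 0.15, 0.15, 0.15, 0.15, 0.15]
assert abs(sum(probs) - 1.0) < 1e-12

random.seed(0)
rolls = random.choices(faces, weights=probs, k=10_000)
counts = Counter(rolls)
# The empirical frequency of each face approximates its probability.
print({f: counts[f] / 10_000 for f in faces})
```

With enough samples the printed frequencies settle near the specified probabilities.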
January 11, 2017
The standard $n$-simplex, sometimes denoted $\Delta^n$, is the $n$-simplex in $\mathbb{R}^{n+1}$ with its vertices at the ‘unit’ points $e_0 = (1, 0, \dots, 0)$, $e_1 = (0, 1, 0, \dots, 0)$, $\dots$, $e_n = (0, \dots, 0, 1)$.
For example, $\Delta^0$ is the point $(1)$ in $\mathbb{R}^1$, $\Delta^1$ is the line segment from $(1, 0)$ to $(0, 1)$ in $\mathbb{R}^2$, and so on. A formal description is $\Delta^n = \{ (t_0, \dots, t_n) \in \mathbb{R}^{n+1} \mid \textstyle\sum_{i=0}^n t_i = 1 \text{ and } t_i \ge 0 \text{ for all } i \}$.
One possible interpretation of $\Delta^n$ is as the set of possible probability distributions of a categorical distribution on $n + 1$ variables.
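The constraints in the formal description translate directly into a membership test; a minimal sketch in Python (the `in_simplex` helper and its tolerance are assumptions for illustration):

```python
# A point of the standard n-simplex is a vector of n+1 nonnegative
# coordinates summing to 1 -- exactly a categorical distribution
# on n+1 outcomes.
def in_simplex(point, tol=1e-9):
    """Return True if `point` lies in the standard n-simplex."""
    return all(t >= -tol for t in point) and abs(sum(point) - 1.0) <= tol

print(in_simplex([1.0]))             # Delta^0: the single point (1)
print(in_simplex([0.3, 0.7]))        # a point of Delta^1
print(in_simplex([0.5, 0.6, -0.1]))  # negative coordinate: not in Delta^2
```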
January 4, 2017
The following are some miscellaneous related properties of transition matrices describing stochastic processes such as Markov chains.
A ‘transition matrix’ is a square matrix $P$ with real entries in $[0, 1]$ such that the sum of the entries of any given column of $P$ is equal to $1$, the interpretation being that $P_{ij}$ is the probability of a stochastic process transitioning from state $j$ to state $i$. A ‘regular’ matrix is a square matrix $P$ such that $P^k$ contains strictly positive entries for some $k$.
For any transition matrix:
- Every transition matrix has $1$ as an eigenvalue.
For regular transition matrices:
- The algebraic multiplicity of $1$ as an eigenvalue of $P$ is $1$.
- The columns of $\lim_{n \to \infty} P^n$ are identical, all being the unique $P$-invariant probability vector $w$ such that the sum of the entries of $w$ is $1$ (because it’s a probability vector) and $Pw = w$.
- With $w$ defined as above, for any probability vector $v$, $\lim_{n \to \infty} P^n v = w$. That is, the system or process described by $P$ eventually converges to the distribution described by $w$, regardless of its initial state.
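A small numerical sketch of the convergence property, using a hypothetical 2-state column-stochastic matrix (chosen arbitrarily for the example):

```python
# Repeatedly squaring a regular column-stochastic matrix P makes its
# columns converge to the unique invariant probability vector w
# satisfying P w = w.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Columns sum to 1: entry [i][j] is the probability of moving
# from state j to state i.
P = [[0.9, 0.5],
     [0.1, 0.5]]

Q = P
for _ in range(30):   # computes P^(2^30), far past convergence
    Q = matmul(Q, Q)

print(Q)  # both columns are approximately the same vector w
```

For this particular matrix, solving $Pw = w$ by hand gives $w = (5/6, 1/6)$, which is what both columns approach.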
December 17, 2016
The transition matrix of a Markov chain is called regular if some positive power of the matrix contains only positive entries.
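One way to check regularity in practice is to examine successive powers of the matrix up to an arbitrary cutoff; a sketch (the `is_regular` helper and the `max_power` cutoff are assumptions for illustration, and a `False` result only means no all-positive power was found within the cutoff):

```python
def is_regular(P, max_power=50):
    """Return True if some power of P up to max_power has all
    strictly positive entries."""
    n = len(P)
    Q = [row[:] for row in P]
    for _ in range(max_power):
        if all(Q[i][j] > 0 for i in range(n) for j in range(n)):
            return True
        # Multiply in one more factor of P.
        Q = [[sum(Q[i][k] * P[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return False

# A chain that alternates deterministically between two states is not
# regular: every power of its matrix contains zero entries.
print(is_regular([[0.0, 1.0], [1.0, 0.0]]))  # False
# This column-stochastic matrix has an all-positive square.
print(is_regular([[0.5, 1.0], [0.5, 0.0]]))  # True
```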
December 10, 2016
The minimum of $n$ exponentially distributed random variables with corresponding parameters $\lambda_1, \dots, \lambda_n$ is itself an exponentially distributed random variable with parameter $\lambda_1 + \dots + \lambda_n$.
It is not, however, the case that the maximum is also exponentially distributed.
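A quick simulation sketch of the minimum property (the rates $1, 2, 3$ are arbitrary choices for the example; by the claim, the minimum should behave like an exponential with rate $6$, whose mean is $1/6$):

```python
import random

# Empirically check that the minimum of independent exponentials with
# rates 1.0, 2.0, 3.0 behaves like an exponential with rate 6.0, by
# comparing sample means (the mean of Exp(lambda) is 1/lambda).
random.seed(1)
rates = [1.0, 2.0, 3.0]
n = 100_000
mins = [min(random.expovariate(r) for r in rates) for _ in range(n)]
mean = sum(mins) / n
print(mean)  # should be close to 1/6 ~= 0.1667
```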
December 1, 2016
A state $s$ in a Markov chain $M$ is ergodic iff it is aperiodic and recurrent. If all states in a Markov chain are ergodic, then the chain itself is said to be ergodic. For a finite chain, this is equivalent to the condition that there is a number $N$ such that any state in $M$ can be reached from any other state in $M$ in exactly $N$ steps.
November 26, 2016
Since a stationary distribution $\pi$ of a finite Markov chain $M$ satisfies $P\pi = \pi$, where $P$ is the transition matrix of $M$, it can be seen as an eigenvector of eigenvalue $1$ under the linear transformation by $P$. Specifically, $\pi$ lies in the intersection of the eigenspace $E_1$ with the hyperplane formed by the constraint that $\sum_i \pi_i = 1$.
(Here the vector space in question is $\mathbb{R}^n$, where $n$ is the number of states in $M$.)
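For a regular chain, that eigenvector can be approximated by simple fixed-point iteration of $v \mapsto Pv$ starting from any probability vector; a sketch with a made-up 2-state column-stochastic matrix:

```python
# Iterating v -> P v drives v toward the stationary distribution
# (for a regular chain); the iterate stays on the hyperplane
# sum(v) = 1 because the columns of P sum to 1.
P = [[0.8, 0.3],
     [0.2, 0.7]]  # columns sum to 1

v = [1.0, 0.0]    # an arbitrary starting probability vector
for _ in range(200):
    v = [sum(P[i][j] * v[j] for j in range(2)) for i in range(2)]

print(v)  # approximately the stationary distribution
```

Solving $P\pi = \pi$ by hand for this matrix gives $\pi = (0.6, 0.4)$, matching the iteration's limit.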
November 25, 2016
A state $s$ in a Markov chain $M$ is absorbing iff it is impossible for the chain to leave $s$ once it’s there. If from every state of $M$ some absorbing state is accessible, then $M$ is likewise said to be absorbing.
November 24, 2016
A state $s$ of a Markov chain is transient if, given that we start from $s$, there is a non-zero probability that we will never return to $s$. A state is transient iff the expected number of visits to that state is finite.
Conversely, $s$ is recurrent if, given that we start from $s$, the probability that we will eventually return to $s$ is $1$. A state is recurrent iff the expected number of visits to that state is infinite.
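A simulation sketch of the "expected number of visits" characterization, using a made-up chain in which state 0 returns to itself with probability 0.5 and otherwise moves to an absorbing state (so state 0 is transient, and its visit count is geometric with expectation $1/(1 - 0.5) = 2$):

```python
import random

# Estimate the expected number of visits to the transient state 0,
# counting the initial visit.
random.seed(2)
n = 100_000
total = 0
for _ in range(n):
    visits = 1                    # we start at state 0
    while random.random() < 0.5:  # return to 0 with probability 0.5
        visits += 1
    total += visits
print(total / n)  # close to the expected value 2
```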
November 22, 2016
A state $s$ in a Markov chain is aperiodic iff there exists an $N$ such that $\Pr(X_n = s \mid X_0 = s) > 0$ for all $n \ge N$.
That is, a state is aperiodic if the chain can loop back to this state in any sufficiently large number of steps.
A Markov chain is aperiodic iff every state in the chain is aperiodic. If the chain is irreducible, then the existence of a single aperiodic state implies that the entire chain is aperiodic.
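One way to make the loop-length picture concrete is to compute the gcd of the lengths of positive-probability loops through a state (a state is aperiodic iff that gcd is $1$); a sketch with a hypothetical `period` helper and made-up chains, where `max_len` is an arbitrary search cutoff rather than part of the definition:

```python
from math import gcd

def period(adj, s, max_len=20):
    """gcd of all loop lengths n <= max_len from state s back to s,
    where adj[i] lists the states reachable from i in one step with
    positive probability."""
    g = 0
    frontier = {s}  # states reachable from s in exactly n steps
    for n in range(1, max_len + 1):
        frontier = {j for i in frontier for j in adj[i]}
        if s in frontier:
            g = gcd(g, n)
    return g

# Loops through state 0 have lengths 3, 5, 6, ...; their gcd is 1,
# so state 0 is aperiodic.
print(period({0: [1], 1: [2], 2: [0, 1]}, 0))
# A two-state back-and-forth chain has period 2 at each state.
print(period({0: [1], 1: [0]}, 0))
```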