Today I Learned

Some of the things I've learned every day since Oct 10, 2016

Category Archives: computation

98: Top Type, Bottom Type

Given a type system, the top type \top is the type which contains every possible value in that type system, making every other type a subtype of \top.

By contrast, the bottom type \bot is the type which contains no values, making every other type a supertype of \bot.
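Python's type system offers rough analogues of both, which makes for a quick sketch (names and examples here are illustrative assumptions, not part of the original note): `object` acts as the top type, since every value is an instance of it, while `NoReturn` (spelled `Never` in newer versions of Python) acts as the bottom type, since no value inhabits it.

```python
from typing import NoReturn

# `object` plays the role of the top type: every value is an instance of
# `object`, so every type is a subtype of it.
def accepts_anything(x: object) -> str:
    return repr(x)

# `NoReturn` plays the role of the bottom type: it has no values, so a
# function annotated with it can only raise or diverge, never return
# normally.
def absurd() -> NoReturn:
    raise RuntimeError("there is no value of the bottom type")

# Every runtime value is an instance of the top type:
print(all(isinstance(v, object) for v in (1, "a", [2, 3], None)))  # True
```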

94: The Liskov Substitution Principle

The Liskov Substitution Principle formalizes the intuition that if A and B are types with A <: B (A is a subtype of B), then in a program with this type system, objects of type B can be replaced with objects of type A without affecting the correctness-relevant properties of the program.
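A minimal sketch of the principle in Python, using hypothetical `Shape` and `Square` classes: code written against the supertype keeps working when handed subtype instances.

```python
class Shape:
    """Supertype B: promises an area() method returning a float."""
    def area(self) -> float:
        raise NotImplementedError

class Square(Shape):
    """Subtype A <: B: honors the supertype's contract."""
    def __init__(self, side: float) -> None:
        self.side = side

    def area(self) -> float:
        return self.side * self.side

def total_area(shapes: list[Shape]) -> float:
    # Written against the supertype Shape; by the substitution principle
    # it must behave correctly when given Square instances instead.
    return sum(s.area() for s in shapes)

print(total_area([Square(2.0), Square(3.0)]))  # 13.0
```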

67: Confluence (Rewriting Systems)

A term a within a rewriting system (such as one in the lambda calculus) is confluent if, for any pair of terms b, c such that a \rightarrow b and a \rightarrow c (where x \rightarrow y means y is a reduct of x under 0 or more reductions), there exists a term d such that b \rightarrow d and c \rightarrow d. If every term in a system is confluent, the system itself is said to be confluent.

Systems which are both confluent and terminating are especially nice: every sequence of reductions applied to a given term terminates, and the resulting irreducible reduct is the same regardless of the order in which the reductions are applied.
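A toy terminating, confluent system can be sketched in Python (the term representation here is a hypothetical choice for illustration): a term is either an int, which is a normal form, or a tuple `('+', a, b)`, which rewrites to the sum once both sides are ints. Two different reduction strategies reach the same normal form.

```python
def step(term, left=True):
    """Contract one redex, preferring the left or the right subterm."""
    if isinstance(term, int):
        return term  # already a normal form
    op, a, b = term
    if isinstance(a, int) and isinstance(b, int):
        return a + b
    if left:
        if not isinstance(a, int):
            return (op, step(a, left), b)
        return (op, a, step(b, left))
    if not isinstance(b, int):
        return (op, a, step(b, left))
    return (op, step(a, left), b)

def normalize(term, left=True):
    """Reduce until no redex remains; terminates for this toy system."""
    while not isinstance(term, int):
        term = step(term, left)
    return term

term = ("+", ("+", 1, 2), ("+", 3, 4))
# Leftmost-first and rightmost-first strategies agree on the normal form:
print(normalize(term, left=True), normalize(term, left=False))  # 10 10
```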

62: Homoiconicity

In programming, homoiconicity is a property of languages in which programs are represented as instances of one of the language’s own primitive data types. This takes advantage of the fact that programs and data can be carried through the same medium: a program’s code is stored in a form that the language already knows how to access and manipulate.

This can make metaprogramming much easier than it otherwise would be. A good example of a language exhibiting homoiconicity is LISP, where code is written as S-expressions, the same nested lists used for data. Since both programs and data are stored as lists, a LISP program can manipulate another program as easily as it would any data.
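The idea can be imitated in Python by representing programs as plain nested lists and writing a tiny evaluator for them (both the evaluator and the rewrite below are illustrative sketches, not a real LISP):

```python
def evaluate(expr):
    """Evaluate an S-expression represented as nested Python lists."""
    if isinstance(expr, (int, float)):
        return expr
    op, *args = expr
    vals = [evaluate(a) for a in args]
    if op == "+":
        return sum(vals)
    if op == "*":
        out = 1
        for v in vals:
            out *= v
        return out
    raise ValueError(f"unknown operator {op!r}")

program = ["+", 1, ["*", 2, 3]]  # code is a list...
print(evaluate(program))          # 7

# ...and a list is data: rewrite every "*" node into "+" by list surgery.
def swap_mul_for_add(expr):
    if not isinstance(expr, list):
        return expr
    head = "+" if expr[0] == "*" else expr[0]
    return [head] + [swap_mul_for_add(e) for e in expr[1:]]

print(evaluate(swap_mul_for_add(program)))  # 1 + (2 + 3) = 6
```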

54: Newman’s Lemma

Newman’s Lemma, also known as the Diamond Lemma, states that if a rewriting system is terminating (is strongly normalizing), then that system is locally confluent if and only if it is confluent.
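The termination hypothesis cannot be dropped. The classic four-element counterexample (an assumption here for illustration, not from the original note) uses the rules b → a, b → c, c → b, c → d: the system is locally confluent but not terminating, and indeed not confluent, since from b one can reach the two distinct normal forms a and d. A short Python check:

```python
# Abstract rewriting system as a dict: term -> set of one-step reducts.
rules = {"b": {"a", "c"}, "c": {"b", "d"}}

def reachable(term):
    """All terms reachable in zero or more rewrite steps."""
    seen, frontier = {term}, [term]
    while frontier:
        t = frontier.pop()
        for s in rules.get(t, ()):
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return seen

def normal_forms(term):
    """Reachable terms with no outgoing rewrite rule."""
    return {t for t in reachable(term) if t not in rules}

print(sorted(normal_forms("b")))  # ['a', 'd']: two normal forms, so not confluent
```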

51: Undecidability of Equality of Lambda Expressions

In general, there is no algorithm which determines whether two lambda expressions are equivalent under the reduction rules of the lambda calculus. This is the problem for which the first proof of undecidability was given (by Alonzo Church, in 1936).

50: Normalization Property (Lambda Calculus)

In a rewrite system, such as \beta-reduction in the lambda calculus, a term may or may not satisfy the weak normalization property or the strong normalization property.

A term T satisfies the weak normalization property if there exists a sequence of rewrites on T that eventually results in a normal form of T: a term which is irreducible.

A term T satisfies the strong normalization property if every sequence of rewrites on T eventually results in a normal form of T.

If every term in a rewrite system satisfies the weak (respectively, strong) normalization property, then the system itself is said to satisfy that property.

An example of a system which satisfies neither normalization property is the pure untyped lambda calculus. An example of a non-normalizing term within this system is

(\lambda y . y y) (\lambda y . y y).

Under \beta-reduction, this term reduces to itself, and so it never terminates in an irreducible form.

Conversely, a term within this system which normalizes (indeed, strongly) is

(\lambda x . x) x.

Under \beta-reduction, this term reduces to just the variable x, an irreducible term.
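Both behaviors can be checked with a small beta-reducer sketched in Python. The term encoding is a hypothetical choice, and the substitution is naive (not capture-avoiding), which suffices for these two terms but not for the full calculus.

```python
# Terms: ('var', x), ('lam', x, body), ('app', f, a).

def subst(term, name, value):
    """Naive substitution of `value` for free occurrences of `name`."""
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        _, x, body = term
        return term if x == name else ("lam", x, subst(body, name, value))
    _, f, a = term
    return ("app", subst(f, name, value), subst(a, name, value))

def step(term):
    """One leftmost-outermost beta step, or None if `term` is a normal form."""
    kind = term[0]
    if kind == "app":
        f, a = term[1], term[2]
        if f[0] == "lam":
            return subst(f[2], f[1], a)
        for i, sub in ((1, f), (2, a)):
            r = step(sub)
            if r is not None:
                return ("app", r, a) if i == 1 else ("app", f, r)
        return None
    if kind == "lam":
        r = step(term[2])
        return None if r is None else ("lam", term[1], r)
    return None  # a bare variable is irreducible

half = ("lam", "y", ("app", ("var", "y"), ("var", "y")))
omega = ("app", half, half)
print(step(omega) == omega)  # True: the term reduces to itself

ident_app = ("app", ("lam", "x", ("var", "x")), ("var", "x"))
print(step(ident_app))       # ('var', 'x'), which is irreducible
```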

There are other forms of the lambda calculus, such as the simply typed lambda calculus, as well as other similar systems, which are normalizing. These can be viewed as programming languages in which every program eventually will (or can) terminate. A significant drawback, however, is that a normalizing system cannot be Turing complete.

Additionally, in the lambda calculus every normalizing term has a unique normal form, a consequence of the confluence of \beta-reduction (the Church–Rosser theorem).