This is a wiki by Tobias Fritz and Paolo Perrone.
Here we work on probability, information theory, and networks from a conceptual and categorical point of view.
Mathematics can be seen as an abstraction of everyday thinking. Most mathematical structures are in fact, implicitly or explicitly, equivalence classes of more “practical” objects.
For example, the idea of “natural numbers” is simply an abstraction of isomorphism classes of finite sets. If we have a set of three apples and a set of three oranges, we can form a bijection between them. This is not possible with, say, three apples and two oranges. “Admitting a bijection” is an equivalence relation, and the number 3 is a symbol that identifies the equivalence class represented by “three apples” and “three oranges”. It is not just the natural numbers that arise this way, but also the basic arithmetic operations between them: addition comes from disjoint union, multiplication from Cartesian product, and powers from sets of functions.
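As a concrete illustration (a minimal Python sketch, not part of the original discussion; the element names are hypothetical), one can check that disjoint union, Cartesian product, and function sets decategorify to addition, multiplication, and exponentiation of cardinalities:

```python
from itertools import product

apples = {"a1", "a2", "a3"}   # a set in the equivalence class "3"
oranges = {"o1", "o2"}        # a set in the equivalence class "2"

# Disjoint union: tag each element with its origin so nothing collides.
disjoint_union = {(0, x) for x in apples} | {(1, y) for y in oranges}

# Cartesian product: all ordered pairs.
cartesian = set(product(apples, oranges))

# Function set: all functions oranges -> apples, encoded as tuples of images
# (one image per element of the two-element domain).
functions = set(product(apples, repeat=len(oranges)))

assert len(disjoint_union) == len(apples) + len(oranges)   # 3 + 2 = 5
assert len(cartesian) == len(apples) * len(oranges)        # 3 * 2 = 6
assert len(functions) == len(apples) ** len(oranges)       # 3 ** 2 = 9
```

Taking `len` of each construction is exactly the decategorification step: it forgets which set we had and remembers only its isomorphism class.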
When we write, for example, 2 + 1 = 3, we are writing an abstraction of the idea: “the disjoint union of a set of two elements (in the equivalence class “2”) and a set of one element (in the equivalence class “1”) is isomorphic to a set of three elements (in the equivalence class “3”)”. In other words, an equation replaces an isomorphism relation.
This abstraction process is called “decategorification”, and it pervades mathematics, as it is often extremely useful. Mathematicians are usually not interested in whether the set contains apples or oranges, but only care about its isomorphism class, the “number”, or “cardinality”.
However, in the decategorification, an important conceptual datum is lost: which isomorphism. Two objects may have many different isomorphisms between them, and this may be of mathematical interest.
For example, a finite set is isomorphic to itself in several different ways, and these bijections form a group, the permutation group. Similarly, an equilateral triangle is symmetric under rotations of 120 degrees. Knowing which isomorphisms an object has with itself (called automorphisms) is crucial whenever one wants to talk about symmetry. The whole field of group theory can be seen as the study of “the ways in which objects can be isomorphic to themselves”.
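To make this concrete, here is a small Python sketch (an illustration, with the encoding of permutations as tuples of images chosen for convenience) that lists the automorphisms of a three-element set and checks that they form a group of order 3! = 6, with the rotations of the triangle as a subgroup:

```python
from itertools import permutations

vertices = (0, 1, 2)  # the vertices of an equilateral triangle

# Every bijection of the set to itself, encoded as the tuple of images.
automorphisms = set(permutations(vertices))
assert len(automorphisms) == 6  # the permutation group S_3 has 3! = 6 elements

def compose(f, g):
    """Apply g first, then f (both encoded as image tuples)."""
    return tuple(f[g[i]] for i in vertices)

# Closure: composing any two automorphisms yields another automorphism.
assert all(compose(f, g) in automorphisms
           for f in automorphisms for g in automorphisms)

# The rotations by 0, 120, and 240 degrees form a cyclic subgroup of order 3.
rotation = (1, 2, 0)                     # rotate each vertex one step
rotations = {(0, 1, 2), rotation, compose(rotation, rotation)}
assert all(compose(f, g) in rotations for f in rotations for g in rotations)
```

The remaining three elements of the group are the reflections of the triangle, which together with the rotations exhaust all six automorphisms.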
The process of “inverting” decategorification, going back from equalities to isomorphisms, is called categorification. Since decategorification loses information, in general a mathematical structure admits many possible categorifications, all perfectly valid, and the choice of one particular categorification depends on what one wants to model.
For example, natural numbers are isomorphism classes of finite sets (as cardinalities), but also isomorphism classes of finite-dimensional vector spaces (as dimensions). The former is a valid categorification to model apples and oranges, while the latter is more useful to model, for example, degrees of freedom.
Are probability and information theory decategorifications of interesting concepts? What are their properties and relations? And most of all, can a categorification give useful insights to ordinary (decategorified) probability and information theory? In particular, can a categorical approach be used to describe and study higher-order statistical interactions?
There are a number of works on applications of category theory to probability and information theory. A (necessarily incomplete) list is:
- M. Giry, A categorical approach to probability theory. Categorical aspects of topology and analysis, pp. 68–85, Lecture Notes in Math. 915, Springer, 1982.
- J. C. Baez, T. Fritz, and T. Leinster. A characterization of entropy in terms of information loss. Entropy, 13(11):1945–1957, 2011. Available here.
- B. Fong. Causal theories: A categorical perspective on Bayesian networks. Master’s thesis, University of Oxford, 2012. Available here.
- M. Gromov. In a search for a structure, part 1: On entropy, 2012. Available here.
- J. C. Baez and T. Fritz, A Bayesian characterization of relative entropy. Theory and Applications of Categories, 29(16):421–456, 2014. Available here.