The main results achieved in this project are summarized in Glenn Shafer's and my 2001 book Probability and Finance: It's Only a Game! The book's web site, which also contains several later developments, is the main web site for the project. This web page describes the origins of the project and contains some early papers (mostly unpublished).
The origins of this project lie in the algorithmic theory of probability, started by Andrei Kolmogorov and developed by, among others, Per Martin-Löf, Leonid Levin, and Claus-Peter Schnorr.
The main notion of the algorithmic theory of probability is that of randomness deficiency. The related notion of randomness was discussed by von Mises in his work on frequentist probability and formalized by Church in 1940. Von Mises's theory of randomness is intrinsically infinitary: each infinite sequence is classified as either random or non-random, but the theory provides no measure of randomness; in particular, it provides no way to detect deviations from randomness in finite sequences. Kolmogorov brought von Mises's theory into the finite world of our experience, defining the randomness deficiency of a finite binary sequence as the length of the sequence minus its algorithmic complexity. Martin-Löf extended Kolmogorov's definition, showing its connection with statistical tests.
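Kolmogorov's definition can be written compactly. The following display is a standard rendering, with C(x) denoting the plain Kolmogorov complexity of x (variants using prefix complexity also appear in the literature):

```latex
% Randomness deficiency of a finite binary sequence x of length |x|,
% where C(x) is the plain Kolmogorov complexity of x:
d(x) \;=\; |x| - C(x)
```

Intuitively, a sequence is random to the degree that it is incompressible: a small deficiency d(x) means that x admits no description much shorter than itself.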
In 1939 Jean Ville showed that von Mises's definition of randomness is unsatisfactory: some sequences that are random in the sense of von Mises do not obey the law of the iterated logarithm. In his Étude critique de la notion de collectif he proposed another definition, based on the notion of martingales. Schnorr extended Kolmogorov's definition in the direction of Ville's martingales. (An equivalent extension was given by Leonid Levin, who, however, was not motivated by Ville's ideas.)
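The law of the iterated logarithm that Ville used as his test case can be stated as follows for a fair-coin sequence; this is the standard formulation, not a quotation from Ville's book:

```latex
% Law of the iterated logarithm for a fair coin:
% if x_1, x_2, ... are independent with P(x_i = 0) = P(x_i = 1) = 1/2
% and S_n = x_1 + ... + x_n, then, with probability one,
\limsup_{n\to\infty} \frac{S_n - n/2}{\sqrt{(n/2)\ln\ln n}} \;=\; 1 .
```

Ville exhibited collectives in von Mises's sense that violate this law; his martingale-based definition excludes them by declaring a sequence non-random whenever some nonnegative martingale (a gambler's capital process in a fair game) tends to infinity along it.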
When I started my work on the algorithmic theory of probability in the autumn of 1980, my conclusion from reading the literature was that there were two more or less equivalent ways to do probability and statistics: the traditional way, based on probability, and the new way, based on randomness deficiency. For several years I saw the role of randomness deficiency only as an aid to intuition; the final results could be expressed either way. This changed after I read Philip Dawid's 1984 and 1985 papers (in JRSSA and the Annals of Statistics, respectively) describing, among other things, his prequential principle. Dawid clearly demonstrated the limitations of the traditional way. His work was done in von Mises's infinitary tradition, but it was easy to combine it with the idea of randomness deficiency (as defined by Schnorr). This was done in my 1993 JRSSB paper (read in 1992).
At Steffen Lauritzen's suggestion, Glenn Shafer and I started writing a book in 1995 on the combined Kolmogorov–Schnorr–Dawid approach to the foundations of probability; the work lasted until 2001.
The following papers can be downloaded: