
The main results achieved in this project are summarized
in Glenn Shafer's and my 2001 book
*Probability and finance: it's only a game*.
This book's web site,
which also contains several later developments,
is the main web site for the project.
This web page describes the origins of the project
and contains some early papers (mostly unpublished).

The origins of this project lie in the algorithmic theory of probability, started by Andrei Kolmogorov and developed by, among others, Per Martin-Löf, Leonid Levin, and Claus-Peter Schnorr.

The main notion of the algorithmic theory of probability is that of randomness deficiency. The related notion of randomness was discussed by von Mises in his work on frequentist probability and formalized by Church in 1940. Von Mises's theory of randomness is intrinsically infinitary: each infinite sequence is classified as either random or non-random, but the theory provides no measure of randomness; in particular, it gives no way to detect deviations from randomness in finite sequences. Kolmogorov brought von Mises's theory to the finite world of our experience by defining the randomness deficiency of a finite binary sequence as the length of the sequence minus its algorithmic complexity. Martin-Löf extended Kolmogorov's definition, showing its connection with statistical tests.
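In the notation usual in the algorithmic theory of probability (writing $l(x)$ for the length of a binary string $x$ and $C(x)$ for its algorithmic complexity), Kolmogorov's definition reads:

```latex
% Randomness deficiency of a finite binary string x:
d(x) = l(x) - C(x)
```

Since $C(x) \le l(x) + O(1)$, the deficiency is essentially nonnegative, and a large value of $d(x)$ indicates that $x$ is far from random.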

In 1939 Jean Ville showed that von Mises's definition of randomness
is unsatisfactory:
some sequences random in the sense of von Mises
do not obey the law of the iterated logarithm.
In his *Étude critique de la notion de collectif*
he proposed another definition,
based on the notion of martingales.
Schnorr extended Kolmogorov's definition in the direction of Ville's martingales.
(An equivalent extension was given by Leonid Levin,
who, however, was not motivated by Ville's ideas.)
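As an illustration of Ville's idea (not taken from any of the texts above): a nonnegative martingale is the capital process of a gambler betting at fair odds, and a sequence fails the martingale test if some such capital process grows without bound on it. A minimal sketch, with an illustrative strategy that always stakes a fixed fraction of its capital on the next bit being 1:

```python
def capital_process(bits, fraction=0.5):
    """Capital of a gambler who starts with 1 unit and at each step
    bets `fraction` of the current capital on the next bit being 1,
    at even odds.  Under the fair-coin measure the resulting capital
    sequence is a nonnegative martingale."""
    capital = 1.0
    history = [capital]
    for b in bits:
        stake = fraction * capital
        # fair odds: win the stake on a correct guess, lose it otherwise
        capital = capital + stake if b == 1 else capital - stake
        history.append(capital)
    return history

# On a heavily biased (non-random) sequence the capital explodes,
# signalling a deviation from randomness in Ville's sense;
# on a balanced sequence it stays bounded.
biased = [1] * 20
balanced = [0, 1] * 10
capital_process(biased)[-1]    # 1.5 ** 20, about 3325.26
capital_process(balanced)[-1]  # 0.75 ** 10, about 0.056
```

The strategy here is purely illustrative; Ville's definition quantifies over all martingales, declaring a sequence non-random if any of them succeeds on it.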

When in Autumn 1980 I started my work on the algorithmic theory of probability,
my conclusion from reading the literature was that there were two,
more or less equivalent,
ways to do probability and statistics:
traditional, based on probability,
and the new way, based on randomness deficiency.
For several years I saw randomness deficiency
only as an aid to intuition;
the final results could be expressed either way.
This changed after I read Philip Dawid's 1984 and 1985 papers
(in *JRSSA* and *Annals of Statistics*,
respectively)
describing, among other things,
his prequential principle.
Dawid clearly demonstrated the limitations of the traditional way.
His work was done in von Mises's infinitary tradition,
but it was easy to combine it with the idea of randomness deficiency
(as defined by Schnorr).
This was done in my 1993 *JRSSB* paper
(read in 1992).

At Steffen Lauritzen's suggestion, in 1995 Glenn Shafer and I started writing a book on the combined Kolmogorov-Schnorr-Dawid approach to the foundations of probability; the work lasted until 2001.

The following papers can be downloaded:

- Game-theoretic versions of Kolmogorov's strong law of large numbers. This is the first of a series of four papers prepared in June 1995 for presentation at a workshop at Aalborg University hosted by Lauritzen. This paper gives two versions of the game-theoretic strong law of large numbers: predictive and finance-theoretic.
- Central limit theorem without probability, June 1995. A simple game-theoretic version of the central limit theorem.
- A purely martingale version of Lindeberg's central limit theorem, June 1995. A game-theoretic version of Lindeberg's theorem.
- Pricing European options without probability, June 1995. The idea of Lindeberg's proof of the central limit theorem is applied to option pricing: in the standard derivation of the Black-Scholes formula, the assumption that the stock price is a realization of a diffusion process is replaced by the assumption that a new dividend-paying security is traded whose price path is "substochastic".
- Black-Scholes formula without stochastic assumptions. This paper develops the idea of the previous paper, replacing the dividend-paying security with a more traditional derivative security (the *square*). The square has some important advantages, but also a potentially serious disadvantage: its price depends exponentially on the expected volatility of the underlying security. It was published as a technical report in March 2000 and in the proceedings of UNICOM Seminars, London, May 2000, pp. 149-154.
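The perfect-information protocol underlying these game-theoretic limit theorems can be sketched in code. On each round Forecaster announces a price, Skeptic chooses a stake, and Reality announces an outcome; Skeptic's capital changes by the stake times the outcome's deviation from the price. The three players below (a constant Forecaster, a fixed-fraction Skeptic, a coin-tossing Reality) are illustrative stand-ins, not strategies from the papers:

```python
import random

def forecasting_game(rounds, forecaster, skeptic, reality):
    """One play of a bounded forecasting protocol: on round n,
    Forecaster announces a price m_n in [0, 1], Skeptic buys M_n
    tickets paying x_n - m_n, and Reality announces x_n in {0, 1}.
    Returns Skeptic's capital path, starting from 1."""
    capital = 1.0
    path = [capital]
    for n in range(rounds):
        m = forecaster(n)
        M = skeptic(n, capital, m)
        x = reality(n, m)
        capital += M * (x - m)
        path.append(capital)
    return path

# Illustrative players: when Reality tosses a coin with the announced
# probability, Skeptic's capital is a nonnegative martingale; it can
# become large only if the outcomes deviate from the forecasts.
path = forecasting_game(
    rounds=1000,
    forecaster=lambda n: 0.5,
    skeptic=lambda n, capital, m: 0.1 * capital,
    reality=lambda n, m: 1 if random.random() < m else 0,
)
```

In the game-theoretic limit theorems, Skeptic plays a specific strategy guaranteeing that either his capital tends to infinity or the relevant limiting property (e.g. the strong law of large numbers for the averages of x_n - m_n) holds.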