In this talk we will investigate different conceptual understandings of what objective chance is. As we shall see, these understandings are grounded in differing interpretations of probability laws.
According to Carnap (1945), chance is a non-epistemic, objective kind of probability. It contrasts with credence, which, in Carnap's framework, is an epistemic, evidential probability measuring an agent's degree of belief.
To date, however, there is no consensus on what chance actually is: both deterministic and indeterministic theories have been proposed to account for it.
We will review some of the seminal approaches to objective probability, which can be grouped into the following two categories:
1. Frequentist interpretations, and
2. Propensity interpretations.
Frequentism interprets chance as the relative frequency of an outcome within a given reference class of trials.
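To fix notation (a schematic rendering added here for concreteness, not a formulation by any particular author), write n_A for the number of occurrences of outcome A in n trials. Finite actual frequentism identifies the chance of A with the observed ratio, while hypothetical frequentism identifies it with the limiting ratio in an idealized infinite sequence of trials:

```latex
% Finite (actual) frequentism: chance as the observed relative frequency
P(A) = \frac{n_A}{n}

% Hypothetical frequentism: chance as the limiting relative frequency
P(A) = \lim_{n \to \infty} \frac{n_A}{n}
```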
Propensity theorists like Popper (1957) interpret probabilities as physical propensities: chance, then, is the tendency of a type of physical situation to produce an outcome of a certain kind.
We will look at important varieties of these (finite actual frequentism, hypothetical frequentism, and Humeanism; long-run propensity and single-case propensity) and argue that none of them is immune to serious objections.
Instead, we will defend a computational interpretation of chance based on Kolmogorov complexity, a measure of the algorithmic compressibility of information. On this view, a probability law is the shortest stochastic algorithm characterizing a random process, and chance expresses the indeterminacy of the succession of events in a random sequence.
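As a rough illustration of what "algorithmic compressibility" means here (our own sketch, not material from the talk): the Kolmogorov complexity K(x) of a string x is the length of the shortest program that outputs x on a fixed universal machine, and a string counts as random when it admits no description much shorter than itself. K is uncomputable, but an off-the-shelf compressor gives a crude upper-bound proxy, enough to contrast a patterned sequence with a (pseudo-)random one:

```python
# Illustrative sketch only: Kolmogorov complexity itself is uncomputable, but a
# general-purpose compressor (zlib) yields a rough upper bound on how
# algorithmically compressible a string is.
import os
import zlib

regular = b"01" * 5000        # highly patterned: a very short program generates it
random_ = os.urandom(10000)   # (pseudo-)random bytes: no short description expected

for label, s in [("regular", regular), ("random", random_)]:
    compressed = len(zlib.compress(s, level=9))
    print(f"{label:8s} original={len(s):6d} bytes  compressed≈{compressed:6d} bytes")

# Typical result: the patterned string shrinks to a few dozen bytes, while the
# random bytes barely compress at all; in the Kolmogorov sense,
# randomness amounts to incompressibility.
```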