The standard definition of a probability on a set A is as a sigma-additive function from a sigma-algebra of subsets of A (a sub-collection of the power set of A) to the extended nonnegative real line (such a function is called a measure), with the additional requirement that the measure of the whole set A be 1.
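In symbols, a sketch of that standard formulation (the symbols F for the sigma-algebra and mu for the measure are just my notation):

\[
\mu : \mathcal{F} \to [0,\infty], \qquad \mathcal{F} \subseteq \mathcal{P}(A) \ \text{a } \sigma\text{-algebra}, \qquad \mu(A) = 1,
\]
\[
\mu\!\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} \mu(E_i) \quad \text{for pairwise disjoint } E_1, E_2, \ldots \in \mathcal{F}.
\]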
If A is the set of natural numbers, such a definition cannot be made in such a way that every set consisting of a single natural number has equal probability, for, if mu({n}) = c for every natural n (where mu is the measure), then either c = 0, in which case sigma-additivity forces mu(A) = 0, or c > 0, in which case any infinite set has infinite measure; either way, mu(A) != 1.
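Spelled out, a sketch of the argument (writing N for the natural numbers):

\[
1 = \mu(\mathbb{N}) = \mu\!\left(\bigcup_{n \in \mathbb{N}} \{n\}\right)
  = \sum_{n \in \mathbb{N}} \mu(\{n\}) = \sum_{n \in \mathbb{N}} c
  = \begin{cases} 0 & \text{if } c = 0, \\ \infty & \text{if } c > 0, \end{cases}
\]

which is a contradiction in either case.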
But it is truly natural to think, for example, about the probability of certain events relating to the choice of a random natural number, such as the probability of picking an even number, which intuitively should be 1/2. Such a probability would also have to assign probability 0 to each particular natural number, as observed above.
I searched the forum and saw that many people work with probability on the natural numbers by relaxing the assumption that the probability be sigma-additive to its being merely finitely additive. This seems to solve the problem, although I still haven't found an algebra and a probability function that seem satisfying (but I guess it is not so hard).
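For what it's worth, one candidate I am aware of (not established above, just a possibility) is natural density, defined on those sets E of naturals for which the limit exists:

\[
d(E) = \lim_{N \to \infty} \frac{\#\{\, n \in E : n \le N \,\}}{N},
\]

which assigns 1/2 to the even numbers and 0 to every singleton. It is finitely additive on the sets where it is defined, but that collection is not closed under intersection (so it is not even an algebra), and d is not sigma-additive; whether this counts as satisfying is part of what I am asking.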
The question, then, is what exactly one misses when relaxing the assumption of sigma-additivity. Does a definition of probability on the subsets of the natural numbers via a finitely additive set function lead to inconsistencies? What theorems about probability stop applying under such a definition?
Also, am I being too restrictive in my general definition of probability, and is it acceptable in modern mathematical culture to define probabilities via only finitely additive functions?