Probability

From Citizendium
Revision as of 20:20, 14 May 2007 by imported>Catherine Woodgold (A possible (not necessarily probable) definition of probability)

Probability is a number representing an estimate of how likely an event is, ranging from 1 for certainty down to 0 for impossibility.

Probability is the topic of probability theory, a branch of mathematics concerned with the analysis of random phenomena. Like algebra, geometry and other parts of mathematics, probability theory has its origins in the natural world. Humans routinely deal with incomplete and/or uncertain information in daily life: in decisions such as crossing the road ("will this approaching car respect the red light?"), eating food ("am I certain this food is not contaminated?"), and so on. Probability theory is a mathematical tool intended to formalize this ubiquitous mental process. The probability concept is a part of this theory and is intended to formalize uncertainty.

There are two basic ways to think about the probability concept:

  • Subjective (Bayesian) probability.
  • Objective probability.

The difference between the approaches is largely pedagogical, as some people find one approach or the other much easier to grasp.

Bayesian probability

In this approach the probabilities represent a state of knowledge. One starts with a set of propositions and all the available information. Using common sense and formal and/or informal methods, one assigns "weights" to the propositions, generating a so-called "prior probability distribution". As more information comes in, one combines this "prior" with the new data to obtain a new "posterior" probability distribution, which represents an updated state of knowledge: each probability is a number that describes how much faith one should presently place in its associated proposition.

Example of the Bayesian viewpoint

We are given a die, and no information about it. We then set up 6 propositions: "The outcome will be 1", "the outcome will be 2", etc. From the principle of maximum entropy, we assume that all 6 outcomes are equally likely. Our state of knowledge as to the outcome is thus best modeled by putting equal "weights" to all 6 alternative propositions, i.e. 1/6 of our total to each. This is then our prior probability distribution.

We will continually watch the outcomes, and continually redistribute our "weights" according to the results we obtain from throwing the die.
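The updating process above can be sketched in code. This is a minimal illustration, not a method prescribed by the article: it uses a simple Dirichlet-multinomial update, in which the uniform prior is represented as one pseudo-count per face and the posterior weights are the normalized totals of pseudo-counts plus observed counts.

```python
def uniform_prior(faces=6):
    """Prior probability distribution: equal weight on each face
    (the maximum-entropy choice when nothing is known about the die)."""
    return [1.0 / faces] * faces

def update(counts, prior_weight=1.0):
    """Posterior weights under a Dirichlet-multinomial model:
    prior pseudo-counts (prior_weight per face) plus observed counts,
    normalized so the weights again sum to 1."""
    faces = len(counts)
    total = sum(counts) + prior_weight * faces
    return [(c + prior_weight) / total for c in counts]

prior = uniform_prior()
# Hypothetical record of 30 throws of a die that favours face 6:
observed = [3, 4, 3, 4, 4, 12]   # counts for faces 1..6
posterior = update(observed)
print(prior)      # equal weights of 1/6 on every face
print(posterior)  # the weight on face 6 has grown
```

Each new batch of throws can be folded in the same way, so the "weights" are continually redistributed as the results come in.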

Objective probability

In this approach one views probabilities as "propensities" of the actual system under study; for instance, a fair coin has a "propensity" to show heads 50% of the time. This approach is more restrictive than the Bayesian interpretation: for instance, there is no way to assign a probability to whether or not life exists in the Andromeda galaxy this way, since no "propensities" have been measured.

Example of the objective viewpoint

We are given a die and a list of its measured (or theoretical) "propensities", i.e. probabilities for each possible outcome. We then use this information to calculate the probabilities of certain outcomes. If our (empirical) results seem improbable, we may decide to do experiments to re-measure the "propensities".
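The measurement step can be sketched as follows: a minimal simulation, assuming a loaded die with known propensities, in which each face's propensity is re-measured as its relative frequency over many throws.

```python
import random

def throw(propensities, rng):
    """Draw one outcome (0-based face index) according to the propensities."""
    return rng.choices(range(len(propensities)), weights=propensities)[0]

def measure(propensities, n_throws, seed=0):
    """Estimate each face's propensity as its relative frequency
    over n_throws simulated throws."""
    rng = random.Random(seed)
    counts = [0] * len(propensities)
    for _ in range(n_throws):
        counts[throw(propensities, rng)] += 1
    return [c / n_throws for c in counts]

true_propensities = [0.1, 0.1, 0.1, 0.1, 0.1, 0.5]  # a loaded die
estimates = measure(true_propensities, 10_000)
print(estimates)  # each estimate should be close to the true value
```

With enough throws the measured frequencies approach the underlying propensities, which is what licenses treating them as objective properties of the die.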

