# Bayes' Theorem

Bayes' Theorem is a theorem in probability theory named for Thomas Bayes (1702–1761).

It is used for updating probabilities by finding conditional probabilities given new data. The simplest case involves a situation in which probabilities have been assigned to each of several mutually exclusive and exhaustive alternatives H1, ..., Hn, exactly one of which is true. New data D is then observed. The conditional probability of D given each of the alternative hypotheses H1, ..., Hn is known. What is needed is the conditional probability of each hypothesis Hi given D. Bayes' Theorem says

${\displaystyle P(H_{i}\mid D)={\frac {P(H_{i})P(D\mid H_{i})}{P(H_{1})P(D\mid H_{1})+\cdots +P(H_{n})P(D\mid H_{n})}}.}$

The use of Bayes' Theorem is sometimes described as follows. Start with the vector of "prior probabilities", i.e. the probabilities of the several hypotheses before the new data is observed:

${\displaystyle P(H_{1}),\dots ,P(H_{n}).\,}$

Multiply these term-by-term by the "likelihood vector":

${\displaystyle P(D\mid H_{1}),\dots ,P(D\mid H_{n}),\,}$

getting

${\displaystyle P(H_{1})P(D\mid H_{1}),\dots ,P(H_{n})P(D\mid H_{n}).\,}$

The sum of these numbers is not (usually) 1. Multiply all of them by the "normalizing constant"

${\displaystyle c={\frac {1}{P(H_{1})P(D\mid H_{1})+\cdots +P(H_{n})P(D\mid H_{n})}},\,}$

getting

${\displaystyle cP(H_{1})P(D\mid H_{1}),\dots ,cP(H_{n})P(D\mid H_{n}).\,}$

The result is the "posterior probabilities", i.e. conditional probabilities given the new data:

${\displaystyle P(H_{1}\mid D),\dots ,P(H_{n}\mid D).\,}$
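The prior-times-likelihood-then-normalize procedure above can be sketched in a few lines of Python (the priors and likelihoods below are illustrative numbers, not taken from any real problem):

```python
# A minimal sketch of the prior -> posterior update described above.

def posterior(priors, likelihoods):
    """Return P(H_i | D) for each hypothesis via Bayes' Theorem."""
    # Term-by-term products P(H_i) * P(D | H_i)
    joint = [p * l for p, l in zip(priors, likelihoods)]
    # Normalizing constant c = 1 / (sum of the products)
    c = 1.0 / sum(joint)
    return [c * j for j in joint]

# Two hypotheses with equal priors; the data are four times as
# probable under H1 as under H2:
print(posterior([0.5, 0.5], [0.8, 0.2]))  # [0.8, 0.2]
```

Note that the posterior probabilities sum to 1 by construction, which is exactly what the normalizing constant c accomplishes.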

In epidemiology, Bayes' Theorem is used to obtain the probability of disease in a group of people with some characteristic, on the basis of the overall rate of that disease and of the probabilities of that characteristic in healthy and diseased individuals. In clinical decision analysis it is used to estimate the probability of a particular diagnosis given the base rate and the presence of particular symptoms or test results.[1]
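A diagnostic-test calculation of this kind can be sketched as follows. Here H1 is "has the disease" and D is "tests positive"; the prevalence, sensitivity, and specificity values are invented for illustration, not real clinical data:

```python
# Illustrative diagnostic-test application of Bayes' Theorem.

def prob_disease_given_positive(prevalence, sensitivity, specificity):
    """P(disease | positive test) by Bayes' Theorem."""
    p_pos_given_disease = sensitivity        # P(D | diseased)
    p_pos_given_healthy = 1.0 - specificity  # P(D | healthy), false-positive rate
    numerator = prevalence * p_pos_given_disease
    denominator = numerator + (1.0 - prevalence) * p_pos_given_healthy
    return numerator / denominator

# A rare disease (1% prevalence) and a test that is 99% sensitive
# and 95% specific:
print(round(prob_disease_given_positive(0.01, 0.99, 0.95), 4))  # 0.1667
```

Even with an accurate test, a low base rate keeps the posterior probability modest, which is why the prior (the overall disease rate) matters so much in this setting.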

## Bayes' Rule

Bayes' Theorem can be cast in either of the following memorable forms:

- posterior odds equals prior odds times likelihood ratio, or
- posterior odds equals prior odds times Bayes factor.

Consider any two hypotheses, not necessarily exhaustive or mutually exclusive, ${\displaystyle H}$ and ${\displaystyle K}$.

Suppose that initially we assign these hypotheses probabilities ${\displaystyle P(H)}$ and ${\displaystyle P(K)}$ and then observe data ${\displaystyle D}$.

The "prior odds" on the hypotheses ${\displaystyle H}$ and ${\displaystyle K}$ is the ratio ${\displaystyle P(H)/P(K)}$.

The "likelihood ratio" or "Bayes factor" for these two hypotheses, given the data ${\displaystyle D}$, is the ratio of the probabilities of the data under the two hypotheses, ${\displaystyle P(D|H)/P(D|K)}$.

The "posterior odds" on the two hypotheses is the ratio of their probabilities given the data, ${\displaystyle P(H|D)/P(K|D)}$.

And indeed, provided no divisions by zero are involved,

${\displaystyle {\frac {P(H|D)}{P(K|D)}}={\frac {P(H)}{P(K)}}\cdot {\frac {P(D|H)}{P(D|K)}}.}$
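The odds form can be sketched directly from the definitions above (the probabilities are illustrative placeholders):

```python
# A sketch of the odds form: posterior odds = prior odds x Bayes factor.

def posterior_odds(p_h, p_k, p_d_given_h, p_d_given_k):
    """Posterior odds P(H|D)/P(K|D) via prior odds and the Bayes factor."""
    prior_odds = p_h / p_k                    # P(H) / P(K)
    bayes_factor = p_d_given_h / p_d_given_k  # P(D|H) / P(D|K)
    return prior_odds * bayes_factor

# Equal priors; the data are three times as probable under H as under K:
print(posterior_odds(0.5, 0.5, 0.75, 0.25))  # 3.0
```

One convenience of this form is that H and K need not be exhaustive or mutually exclusive, so no normalizing constant is required: it cancels in the ratio.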