Bayes Theorem: Difference between revisions

From Citizendium
imported>Michael Hardy
(Deleting nonsense. That application may be the most familiar to some particular audience to whom these words were originally addressed, but this is for a broader audience.)
'''Bayes' Theorem''' is a theorem in [[probability theory]] named for [[Thomas Bayes]] (1702&ndash;1761).
 
It is used for updating probabilities by finding [[conditional probability|conditional probabilities]] given new data.  The simplest case involves a situation in which probabilities have been assigned to each of several mutually exclusive and exhaustive alternatives ''H''<sub>1</sub>,&nbsp;...,&nbsp;''H''<sub>''n''</sub>, exactly one of which is true.  New data ''D'' is observed.  The conditional probability of ''D'' given each of the alternative hypotheses ''H''<sub>1</sub>,&nbsp;...,&nbsp;''H''<sub>''n''</sub> is known.  What is needed is the conditional probability of each hypothesis ''H''<sub>''i''</sub> given ''D''.  Bayes' Theorem says
 
: <math> P(H_i\mid D) = \frac{P(H_i)P(D\mid H_i)}{P(H_1)P(D\mid H_1)+\cdots+P(H_n)P(D\mid H_n)}. </math>
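The formula follows from the definition of [[conditional probability]] together with the law of total probability.  By definition,

: <math> P(H_i\mid D) = \frac{P(H_i\cap D)}{P(D)} \qquad\text{and}\qquad P(H_i\cap D) = P(H_i)P(D\mid H_i). </math>

Because the hypotheses are mutually exclusive and exhaustive, the law of total probability gives

: <math> P(D) = P(H_1)P(D\mid H_1)+\cdots+P(H_n)P(D\mid H_n),\, </math>

and substituting both identities into the first expression yields the theorem.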
 
The use of Bayes' Theorem is sometimes described as follows.  Start with the vector of "prior probabilities", i.e. the probabilities of the several hypotheses ''before'' the new data is observed:
 
: <math> P(H_1),\dots,P(H_n).\, </math>
 
Multiply these term-by-term by the "likelihood vector":
 
: <math> P(D\mid H_1),\dots,P(D\mid H_n),\, </math>
 
getting
 
: <math> P(H_1)P(D\mid H_1),\dots,P(H_n)P(D\mid H_n).\, </math>
 
The sum of these numbers is usually not 1.  Multiply all of them by the "normalizing constant"
 
: <math> c = (P(H_1)P(D\mid H_1)+\cdots+P(H_n)P(D\mid H_n))^{-1},\, </math>
 
getting
 
: <math> cP(H_1)P(D\mid H_1),\dots,cP(H_n)P(D\mid H_n).\, </math>
 
The result is the "posterior probabilities", i.e. conditional probabilities given the new data:
 
: <math> P(H_1\mid D),\dots,P(H_n\mid D).\, </math>
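The steps above can be illustrated numerically.  In this sketch the three hypotheses and their prior and likelihood values are invented purely for the example:

```python
# Prior -> likelihood -> posterior, following the steps described above.
# Hypothetical numbers for three mutually exclusive, exhaustive hypotheses.
priors = [0.5, 0.3, 0.2]        # P(H_1), P(H_2), P(H_3)
likelihoods = [0.1, 0.4, 0.8]   # P(D | H_1), P(D | H_2), P(D | H_3)

# Multiply term by term; these products do not (usually) sum to 1.
joint = [p * l for p, l in zip(priors, likelihoods)]

# Normalizing constant c = 1 / (sum of the products).
c = 1 / sum(joint)

# Posterior probabilities P(H_i | D); by construction they sum to 1.
posteriors = [c * j for j in joint]
print([round(p, 4) for p in posteriors])  # [0.1515, 0.3636, 0.4848]
```

Note that the normalizing constant never needs to be known in advance; it is recovered from the products themselves, which is why Bayes' Theorem is often quoted as "posterior is proportional to prior times likelihood".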
 
In [[epidemiology]], it is used to obtain the probability of disease in a group of people with some characteristic on the basis of the overall rate of that disease and of the likelihoods of that characteristic in healthy and diseased individuals.  In clinical decision analysis it is used for estimating the probability of a particular diagnosis given the appearance of some symptoms or test result.<ref name="MeSH">{{cite web |url=http://www.nlm.nih.gov/cgi/mesh/2008/MB_cgi?mode= |title=Bayes Theorem |accessdate=2007-12-09 |author=National Library of Medicine |authorlink= |coauthors= |date= |format= |work= |publisher= |pages= |language= |archiveurl= |archivedate= |quote=}}</ref>
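The diagnostic use can be sketched with just two hypotheses, "diseased" and "healthy".  The prevalence, sensitivity, and specificity below are invented for illustration, not taken from any real test:

```python
# Bayes' Theorem for a diagnostic test (hypothetical numbers).
prevalence = 0.01    # P(disease): overall rate of the disease
sensitivity = 0.95   # P(positive test | disease)
specificity = 0.90   # P(negative test | healthy)

# Probability that a healthy person tests positive (a false positive).
p_pos_given_healthy = 1 - specificity

# Posterior probability of disease given a positive test:
# P(disease | +) = P(disease) P(+ | disease) / P(+).
numerator = prevalence * sensitivity
denominator = numerator + (1 - prevalence) * p_pos_given_healthy
p_disease_given_pos = numerator / denominator
print(round(p_disease_given_pos, 4))  # 0.0876
```

Even with a fairly accurate test, the posterior probability of disease here is under 9%, because the disease is rare: the false positives from the large healthy group outnumber the true positives.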



Revision as of 19:55, 19 December 2007


==Calculations==

''For more information, see:'' [[Sensitivity and specificity]].


==References==

1. National Library of Medicine. [http://www.nlm.nih.gov/cgi/mesh/2008/MB_cgi?mode= Bayes Theorem]. Retrieved on 2007-12-09.