This article defines some terms which characterize probability distributions of two or more variables.
Conditional probability is the probability of some event A, given the occurrence of some other event B. Conditional probability is written P(A|B), and is read "the probability of A, given B".
Joint probability is the probability of two events in conjunction; that is, the probability of both events occurring together. The joint probability of A and B is written $P(A \cap B)$ or $P(A, B)$.

Marginal probability is the probability of one event, regardless of the other event. Marginal probability is obtained by summing (or integrating, more generally) the joint probability over the unrequired event. This is called marginalization. The marginal probability of A is written $P(A)$, and the marginal probability of B is written $P(B)$.
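In the simplest discrete case, marginalization amounts to summing the joint probability over the ways the other event can or cannot occur:

$P(A) = P(A \cap B) + P(A \cap B^{c}),$

where $B^{c}$ denotes the complement of B.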
In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B, or vice versa, or they may happen at the same time. A may cause B, or vice versa, or they may have no causal relation at all.
Conditioning of probabilities, i.e. updating them to take account of (possibly new) information, may be achieved through Bayes' theorem.
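For reference, Bayes' theorem expresses this updating rule by rewriting one conditional probability in terms of the reverse one:

$P(A \mid B) = \dfrac{P(B \mid A)\,P(A)}{P(B)}.$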
Definition
Given events (or subsets) A and B in the sample space $\Omega$ (also termed by some textbooks the universe), if it is known that an element randomly drawn from $\Omega$ belongs to B, then the probability that it also belongs to A is defined to be the conditional probability of A, given B. When all outcomes are equally likely, this definition gives

$P(A \mid B) = \dfrac{|A \cap B|}{|B|}.$

Now, divide the numerator and denominator by $|\Omega|$ to obtain

$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}.$

Equivalently, we have

$P(A \cap B) = P(A \mid B)\,P(B).$
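To make the formula concrete, here is a minimal Python sketch that computes a conditional probability by direct counting over a finite sample space; the die-roll events are an illustrative assumption, not an example from the article:

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die (equally likely outcomes).
omega = set(range(1, 7))

A = {n for n in omega if n % 2 == 0}  # A: the roll is even   -> {2, 4, 6}
B = {n for n in omega if n > 3}       # B: the roll exceeds 3 -> {4, 5, 6}

def prob(event):
    """P(event) by counting, valid because all outcomes are equally likely."""
    return Fraction(len(event), len(omega))

# P(A|B) = P(A and B) / P(B), matching the formula derived above.
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)  # 2/3: of the rolls {4, 5, 6}, two are even
```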
Statistical independence
Two random events A and B are statistically independent if and only if

$P(A \cap B) = P(A)\,P(B).$

Thus, if A and B are independent, then their joint probability can be expressed as a simple product of their individual probabilities.

Equivalently, for two independent events A and B (with $P(A) \neq 0$ and $P(B) \neq 0$, so that the conditional probabilities are defined),

$P(A \mid B) = P(A)$

and

$P(B \mid A) = P(B).$

In other words, if A and B are independent, then the conditional probability of A, given B is simply the individual probability of A alone; likewise, the probability of B given A is simply the probability of B alone.
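A small check of independence in the same counting style as the sketch above; the particular events are again an illustrative assumption:

```python
from fractions import Fraction

omega = set(range(1, 7))  # one roll of a fair die
A = {2, 4, 6}             # A: the roll is even,      P(A) = 1/2
B = {1, 2}                # B: the roll is at most 2, P(B) = 1/3

def prob(event):
    return Fraction(len(event), len(omega))

# Independence: the joint probability factors into the product ...
assert prob(A & B) == prob(A) * prob(B)   # 1/6 == 1/2 * 1/3
# ... and, equivalently, conditioning on B leaves P(A) unchanged.
assert prob(A & B) / prob(B) == prob(A)   # P(A|B) == P(A)
```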
Mutual exclusivity
Two events A and B are mutually exclusive if and only if

$P(A \cap B) = 0$

as long as $P(A) \neq 0$ and $P(B) \neq 0$. Then

$P(A \mid B) = 0$

and

$P(B \mid A) = 0.$
In other words, the probability of A happening, given that B happens, is nil since A and B cannot both happen in the same situation; likewise, the probability of B happening, given that A happens, is also nil.
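A quick worked instance (my own illustration, not from the article): for a single roll of a fair die, let A be "the roll is 1" and B be "the roll is 2". The events cannot co-occur, so

$P(A \cap B) = 0$, and hence $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)} = \dfrac{0}{1/6} = 0.$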
Other considerations
- If B is an event with $P(B) > 0$, then the function $Q$ defined by $Q(A) = P(A \mid B)$ for all events A is a probability measure.
- If $P(B) = 0$, then $P(A \mid B)$ is left undefined.
- Conditional probability can be calculated with a decision tree, as in the sketch below.
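A minimal sketch of the decision-tree approach, assuming a made-up two-stage experiment (the urns, colours, and branch probabilities are illustrative): joint probabilities are products along root-to-leaf paths, and conditioning renormalizes over the leaves consistent with the observed event.

```python
from fractions import Fraction

# Hypothetical two-stage experiment: pick one of two urns, then draw a ball.
# Each node stores the branch probability into it; multiplying along a
# root-to-leaf path gives the joint probability of that leaf.
tree = {
    "urn1": (Fraction(1, 2), {"red": Fraction(3, 4), "blue": Fraction(1, 4)}),
    "urn2": (Fraction(1, 2), {"red": Fraction(1, 4), "blue": Fraction(3, 4)}),
}

# Joint probabilities P(urn, colour) from the tree paths.
joint = {
    (urn, colour): p_urn * p_colour
    for urn, (p_urn, branches) in tree.items()
    for colour, p_colour in branches.items()
}

# Marginal P(red): sum the joint probability over urns (marginalization).
p_red = sum(p for (urn, colour), p in joint.items() if colour == "red")

# Conditional P(urn1 | red) = P(urn1 and red) / P(red).
p_urn1_given_red = joint[("urn1", "red")] / p_red
print(p_urn1_given_red)  # 3/4
```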
The conditional probability fallacy
The conditional probability fallacy is the assumption that P(A|B) is approximately equal to P(B|A), or that one value can stand in for the other. The mathematician John Allen Paulos discusses this in his book Innumeracy, where he points out that it is a mistake often made even by doctors, lawyers, and other highly educated non-statisticians. It can be overcome by describing the data in actual counts rather than probabilities.
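A worked illustration in the counts-based style the paragraph recommends; the prevalence and test-accuracy figures are assumptions for the sake of the example, not numbers from Paulos:

```python
# Hypothetical screening test, expressed in whole counts for a population
# of 100,000 people (all figures are illustrative assumptions).
population = 100_000
sick = population // 100              # 1% prevalence        -> 1,000 sick
healthy = population - sick           # the rest are healthy -> 99,000

true_positives = sick * 99 // 100     # test catches 99% of the sick -> 990
false_positives = healthy * 5 // 100  # 5% false-positive rate       -> 4,950

# P(positive | sick) is high ...
p_pos_given_sick = true_positives / sick                  # 0.99
# ... but P(sick | positive) is much lower, because most positives
# come from the far larger healthy group.
p_sick_given_pos = true_positives / (true_positives + false_positives)
print(p_pos_given_sick, round(p_sick_given_pos, 3))       # 0.99 0.167
```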
See also
- Bayes' theorem
- Likelihood function
- Posterior probability
- Probability theory
- Monty Hall problem
- Prosecutor's fallacy
- Conditional expectation
This page uses Creative Commons Licensed content from Wikipedia (view authors).