Conditional Probability


Conditional probability refers to the chances that some outcome occurs given that another event has also occurred.

What Is Conditional Probability?

Conditional probability is defined as the likelihood of an event or outcome occurring, based on the occurrence of a previous event or outcome. It is calculated by dividing the probability of both events occurring together by the probability of the preceding, or conditioning, event; equivalently, the joint probability of the two events equals the probability of the preceding event multiplied by the updated probability of the succeeding, or conditional, event.

For example:

A conditional probability would look at two events in relationship with one another, such as the probability that you are both accepted to college and provided with dormitory housing. Suppose there is an 80% chance of being accepted to the college and that dormitory housing is provided for 60% of accepted students. Then P(Accepted and dormitory housing) = P(Dormitory housing | Accepted) x P(Accepted) = 0.60 x 0.80 = 0.48.
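
As a minimal sketch (not part of the original example), the joint probability above can be computed directly; the variable names are illustrative.

```python
# Joint probability of acceptance and dormitory housing,
# using P(A and B) = P(B | A) * P(A). Values taken from the example above.
p_accepted = 0.80                 # P(Accepted)
p_housing_given_accepted = 0.60   # P(Dormitory housing | Accepted)

p_accepted_and_housing = p_housing_given_accepted * p_accepted
print(round(p_accepted_and_housing, 2))  # 0.48
```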

Conditional probability can be contrasted with unconditional probability. Unconditional probability refers to the likelihood that an event will take place irrespective of whether any other events have taken place or any other conditions are present.

Conditional probability refers to the chances that some outcome occurs given that another event has also occurred.
It is often stated as the probability of B given A and is written as P(B|A), where the probability of B depends on that of A happening.
Conditional probability can be contrasted with unconditional probability.

Understanding Conditional Probability

As previously stated, conditional probabilities are contingent on a previous result, and they also rest on a number of assumptions. For example, suppose you are drawing three marbles (red, blue, and green) from a bag, one at a time and without replacement. Each marble has an equal chance of being drawn. What is the conditional probability of drawing the red marble after already drawing the blue one?

First, the probability of drawing a blue marble is about 33% because it is one of three equally likely outcomes. Assuming this first event occurs, two marbles remain, each with a 50% chance of being drawn. So the conditional probability of drawing the red marble after already drawing the blue one is 50%, while the chance of drawing the blue marble and then the red marble as a sequence (the joint probability) is about 16.5% (33% x 50%).
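
These numbers can be checked with a quick simulation. This is an illustrative sketch rather than part of the original example; the trial count and names are arbitrary.

```python
import random

# Quick simulation: draw two marbles without replacement from {red, blue, green}
# and estimate the probabilities discussed above.
trials = 100_000
blue_first = 0
blue_then_red = 0

for _ in range(trials):
    first, second = random.sample(["red", "blue", "green"], 2)
    if first == "blue":
        blue_first += 1
        if second == "red":
            blue_then_red += 1

print("P(blue drawn first)       ~", blue_first / trials)         # about 0.33
print("P(blue then red), joint   ~", blue_then_red / trials)      # about 0.165
print("P(red | blue drawn first) ~", blue_then_red / blue_first)  # about 0.5
```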

As another example to provide further insight into this concept, consider that a fair die has been rolled and you are asked to give the probability that it was a five. There are six equally likely outcomes, so your answer is 1/6. But imagine if before you answer, you get extra information that the number rolled was odd. Since there are only three odd numbers that are possible, one of which is five, you would certainly revise your estimate for the likelihood that a five was rolled from 1/6 to 1/3.

This revised probability that an event A has occurred, considering the additional information that another event B has definitely occurred on this trial of the experiment, is called the conditional probability of A given B and is denoted by P(A|B).

Conditional Probability Formula

P(B|A) = P(A and B) / P(A)

P(B|A) = P(A∩B) / P(A)
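
As an illustrative sketch not taken from the article, the same relationship can be computed by counting equally likely outcomes. The function below uses the P(A|B) = P(A∩B) / P(B) form, matching the P(A|B) notation used earlier, to answer the die question above; the function name and event sets are hypothetical.

```python
from fractions import Fraction

def conditional_probability(event_a, event_b, sample_space):
    """P(A | B) = P(A and B) / P(B), computed by counting equally likely outcomes."""
    p_b = Fraction(len(event_b & sample_space), len(sample_space))
    p_a_and_b = Fraction(len(event_a & event_b & sample_space), len(sample_space))
    return p_a_and_b / p_b

# Die example from above: probability the roll was a five, given that it was odd.
sample_space = {1, 2, 3, 4, 5, 6}
five = {5}
odd = {1, 3, 5}

print(conditional_probability(five, odd, sample_space))  # 1/3
```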

Another Example of Conditional Probability

As another example, suppose a student is applying for admission to a university and hopes to receive an academic scholarship. The school to which they are applying accepts 100 of every 1,000 applicants (10%) and awards academic scholarships to 10 of every 500 students who are accepted (2%). Of the scholarship recipients, 50% also receive university stipends for books, meals, and housing. For our ambitious student, the chance of being accepted and then receiving a scholarship is 0.2% (0.10 x 0.02). The chance of being accepted, receiving the scholarship, and then also receiving a stipend for books, meals, and housing is 0.1% (0.10 x 0.02 x 0.5). (You can also check out Bayes' Theorem, discussed below.)
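
For reference, here is a short sketch of the chained multiplication above; the variable names are illustrative.

```python
# Chained probabilities from the scholarship example above.
p_accepted = 100 / 1000                   # 10% of applicants are accepted
p_scholarship_given_accepted = 10 / 500   # 2% of accepted students get a scholarship
p_stipend_given_scholarship = 0.50        # 50% of scholarship recipients get a stipend

p_accepted_and_scholarship = p_accepted * p_scholarship_given_accepted
p_all_three = p_accepted_and_scholarship * p_stipend_given_scholarship

print(f"{p_accepted_and_scholarship:.1%}")  # 0.2%
print(f"{p_all_three:.1%}")                 # 0.1%
```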

Conditional Probability vs. Joint Probability and Marginal Probability

Conditional probability: p(A|B) is the probability of event A occurring, given that event B occurs. Example: given that you drew a red card, what is the probability that it is a four? p(four|red) = 2/26 = 1/13, because out of the 26 red cards there are exactly two fours.

Marginal probability: the probability of an event occurring on its own, p(A); it may be thought of as an unconditional probability because it is not conditioned on another event. Example: the probability that a card drawn is red is p(red) = 26/52 = 0.5. Another example: the probability that a card drawn is a four is p(four) = 4/52 = 1/13.

Joint probability: p(A and B), the probability of event A and event B occurring together. It is the probability of the intersection of two or more events and may be written p(A ∩ B). Example: the probability that a card is a four and red is p(four and red) = 2/52 = 1/26, since there are two red fours in a deck of 52, the four of hearts and the four of diamonds.
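
As an illustrative sketch (not from the article), the three card probabilities above can be verified by enumerating a standard 52-card deck; the names used here are arbitrary.

```python
from fractions import Fraction
from itertools import product

# Enumerate a standard 52-card deck as (rank, suit) pairs.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))

red = [card for card in deck if card[1] in ("hearts", "diamonds")]
fours = [card for card in deck if card[0] == "4"]
red_fours = [card for card in deck if card[0] == "4" and card[1] in ("hearts", "diamonds")]

p_red = Fraction(len(red), len(deck))                 # marginal: 26/52 = 1/2
p_four = Fraction(len(fours), len(deck))              # marginal: 4/52 = 1/13
p_four_and_red = Fraction(len(red_fours), len(deck))  # joint: 2/52 = 1/26
p_four_given_red = p_four_and_red / p_red             # conditional: 1/13

print(p_red, p_four, p_four_and_red, p_four_given_red)  # 1/2 1/13 1/26 1/13
```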

Bayes' Theorem

Bayes' theorem, named after 18th-century British mathematician Thomas Bayes, is a mathematical formula for determining conditional probability. The theorem provides a way to revise existing predictions or theories (update probabilities) given new or additional evidence. In finance, Bayes' theorem can be used to rate the risk of lending money to potential borrowers.

Bayes' theorem is also called Bayes' Rule or Bayes' Law and is the foundation of the field of Bayesian statistics. This set of probability rules allows one to update predictions of events occurring based on new information that has been received, making for better and more dynamic estimates.
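
Below is a minimal sketch of Bayes' theorem, P(A|B) = P(B|A) P(A) / P(B), applied to the lending use case mentioned above. The numbers are invented purely for illustration.

```python
# Illustrative use of Bayes' theorem, P(A | B) = P(B | A) * P(A) / P(B),
# for rating borrower risk. All numbers are made up for demonstration only.
p_default = 0.05                  # prior probability a borrower defaults
p_missed_given_default = 0.90     # P(missed payment | default)
p_missed_given_no_default = 0.10  # P(missed payment | no default)

# Total probability of observing a missed payment (law of total probability).
p_missed = (p_missed_given_default * p_default
            + p_missed_given_no_default * (1 - p_default))

# Posterior: updated probability of default after seeing a missed payment.
p_default_given_missed = p_missed_given_default * p_default / p_missed
print(round(p_default_given_missed, 3))  # about 0.321
```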

Related terms:

A Priori Probability

A priori probability is a likelihood of occurrence that can be deduced logically by examining existing information.

Bayes' Theorem

Bayes' theorem is a mathematical formula for determining conditional probability.

Compound Probability

Compound probability is a mathematical term relating to the likelihood of two independent events occurring.

Joint Probability

Joint probability is a statistical measure that calculates the likelihood of two events occurring together at the same point in time; that is, the probability of event Y occurring at the same time that event X occurs.

Prior Probability

A prior probability, in Bayesian statistical inference, is the probability of an event based on established knowledge, before empirical data is collected.

Risk

Risk takes on many forms but is broadly categorized as the chance an outcome or investment's actual return will differ from the expected outcome or return.

Unconditional Probability

An unconditional probability is an independent chance that a single outcome results from a sample of possible outcomes.

Uniform Distribution

Uniform distribution is a type of probability distribution in which all outcomes are equally likely.