
Prior Probability
What Is Prior Probability?
Prior probability, in Bayesian statistical inference, is the probability of an event before new data is collected. This is the best rational assessment of the probability of an outcome based on the current knowledge before an experiment is performed.
Prior Probability Explained
The prior probability of an event will be revised as new data or information becomes available, to produce a more accurate measure of a potential outcome. That revised probability becomes the posterior probability and is calculated using Bayes' theorem. In statistical terms, the posterior probability is the probability of event A occurring given that event B has occurred.
For example, suppose three acres of land are labeled A, B, and C. One acre has oil reserves below its surface, while the other two do not. The prior probability of oil being found on acre C is one third, or 0.333. But if a drilling test is conducted on acre B and the results indicate that no oil is present at that location, the posterior probability of oil being found on acre A or acre C rises to 0.5, since each of the two remaining acres now has one chance in two, as the short derivation below shows.
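To make the update explicit (a short worked step using the article's numbers; the event labels are ours), let A be "oil on acre C" and B be "the test finds no oil on acre B". Then

$$P(\text{oil on }C \mid \text{no oil on }B) = \frac{P(\text{no oil on }B \mid \text{oil on }C)\,P(\text{oil on }C)}{P(\text{no oil on }B)} = \frac{1 \times \tfrac{1}{3}}{\tfrac{2}{3}} = \frac{1}{2}$$

Here P(no oil on B) = 2/3 because the single oil-bearing acre is A or C in two of the three equally likely cases.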
Bayes' theorem is a fundamental theorem widely used in data mining and machine learning.
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(A) \times P(B \mid A)}{P(B)}$$

where:
P(A) = the prior probability of A occurring
P(A | B) = the conditional probability of A given that B occurs
P(B | A) = the conditional probability of B given that A occurs
P(B) = the probability of B occurring
If we are interested in the probability of an event for which we have prior observations, we call this the prior probability. We'll deem this event A, and its probability P(A). If there is a second event that affects P(A), which we'll call event B, then we want to know the probability of A given that B has occurred. In probabilistic notation, this is P(A | B), known as the posterior probability or revised probability, because it is assessed after the new information has arrived, hence the "post" in "posterior." This is how Bayes' theorem allows us to update our previous beliefs with new information.
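As a minimal sketch (the function bayes_posterior and its argument names are our own, not from the article), the update rule translates directly into code and reproduces the oil-acre posterior of 0.5:

def bayes_posterior(prior_a: float, likelihood_b_given_a: float, prob_b: float) -> float:
    """Return P(A|B) via Bayes' theorem: P(A|B) = P(A) * P(B|A) / P(B)."""
    if prob_b <= 0:
        raise ValueError("P(B) must be positive to condition on B")
    return prior_a * likelihood_b_given_a / prob_b

# Oil-acre example: A = "oil on acre C", B = "drilling finds no oil on acre B".
prior = 1 / 3       # P(A): each acre equally likely before drilling
likelihood = 1.0    # P(B|A): if the oil is on C, drilling B surely finds none
evidence = 2 / 3    # P(B): the oil is on A or C in two of three cases
print(bayes_posterior(prior, likelihood, evidence))  # 0.5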
Related terms:
Bayes' Theorem
Bayes' theorem is a mathematical formula for determining conditional probability.
Conditional Probability
Conditional probability is the chance of an event or outcome that is itself based on the occurrence of some other previous event or outcome.
Data Mining
Data mining is a process used by companies to turn raw data into useful information by using software to look for patterns in large batches of data.
Joint Probability
Joint probability is a statistical measure that calculates the likelihood of two events, X and Y, occurring together at the same point in time.
Machine Learning
Machine learning, a field of artificial intelligence (AI), is the idea that a computer program can adapt to new data independently of human action.
Posterior Probability
Posterior probability is the revised probability of an event occurring after taking into consideration new information.
T-Test
A t-test is a type of inferential statistic used to determine if there is a significant difference between the means of two groups, which may be related in certain features.
Unconditional Probability
An unconditional probability is the independent chance that a single outcome results from a sample of possible outcomes.