Bayes’ Theorem By SabareeshBabu and Rishabh Kumar
Introduction
Bayes’ Theorem shows the relation between a conditional probability and its inverse, and provides a mathematical rule for revising an estimate or forecast in light of experience and observation. It relates:
- the prior probability of A, P(A): the probability of event A before any information about its associated event B is taken into account
- the prior probability of B, P(B): the probability of event B before any information about A is taken into account
- the conditional probability of B given A, P(B│A), also called the likelihood
- the conditional probability of A given B, P(A│B), also called the posterior probability
Simple example of prior, conditional, and posterior probability
Consider three events A, B, and C.
The conditional probability of event C given that A occurs, P(C│A), is 1/6.
The posterior probability, P(A│C), is 1/5. What is P(C│B)? Answer: 0.
Example 1
1% of women at age forty who participate in routine screening have breast cancer. 80% of women with breast cancer will get positive mammograms. 9.6% of women without breast cancer will also get positive mammograms. A woman in this age group had a positive mammogram in a routine screening. What is the probability that she actually has breast cancer?
Without Bayes’ Theorem
Assume a large sample and use the probabilities given in the problem to work out the answer. Suppose, for example, that 10,000 women participate in a routine screening for breast cancer. 1%, or 100 women, have breast cancer. 80% of the women with breast cancer, that is 80 women, will get positive mammograms. 9.6% of the 9,900 women who don’t have breast cancer, about 950 women, will also get positive mammograms. Create a table from these counts and read off the answer.
Without Bayes’ Theorem cont.

                       Breast cancer   No breast cancer    Total
Positive mammogram            80               950          1,030
Negative mammogram            20             8,950          8,970
Total                        100             9,900         10,000

Out of the 1,030 women who get positive mammograms, only 80 actually have breast cancer; therefore, the probability is 80/1030, or 7.767%.
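The table above can be reproduced with a few lines of Python; this is a minimal sketch (the variable names are mine, not from the slides) that builds the hypothetical sample of 10,000 women from the stated rates and reads the answer off the counts.

```python
# Hypothetical screening sample of 10,000 women, using the rates from the problem.
total = 10_000
with_cancer = round(0.01 * total)                 # 100 women have breast cancer
without_cancer = total - with_cancer              # 9,900 women do not

true_positives = round(0.80 * with_cancer)        # 80 women: cancer, positive mammogram
false_positives = round(0.096 * without_cancer)   # 950 women: no cancer, positive mammogram
all_positives = true_positives + false_positives  # 1,030 positive mammograms in total

print(true_positives / all_positives)             # 0.0776..., i.e. about 7.77%
```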
Bayes’ Theorem: the posterior probability of A given B equals the conditional probability of B given A (the likelihood) multiplied by the prior probability of A, all divided by the prior probability of B.
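In symbols (this restates the sentence above; the expanded form of the denominator is the law of total probability, which the following slides use):

\[
P(A \mid B) \;=\; \frac{P(B \mid A)\,P(A)}{P(B)},
\qquad
P(B) \;=\; P(B \mid A)\,P(A) \;+\; P(B \mid A')\,P(A').
\]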
Using Bayes’ Theorem
Let B denote the event “has breast cancer” and + the event “positive mammogram.”
1% of women at age forty who participate in routine screening have breast cancer: P(B) = 0.01.
80% of women with breast cancer will get positive mammograms: P(+│B) = 0.8.
9.6% of women without breast cancer will also get positive mammograms: P(+│B’) = 0.096.
A woman in this age group had a positive mammogram in a routine screening. What is the probability that she actually has breast cancer? Find P(B│+).
Using Bayes’ Theorem cont.
P(B│+) = P(+│B) P(B) / P(+)
P(B), P(+│B), and P(+│B’) are known; P(+) is needed to find P(B│+).
P(+) = P(+│B) P(B) + P(+│B’) P(B’)
P(+) = (0.8)(0.01) + (0.096)(0.99)
P(+) = 0.1030
P(B│+) = (0.8)(0.01) / (0.1030)
P(B│+) = 0.07767
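The same arithmetic as a short Python sketch (illustrative names, not from the slides); it reproduces the 7.767% figure obtained earlier from the frequency table:

```python
# Example 1: P(B|+) = P(+|B) P(B) / P(+), with P(+) from the law of total probability.
p_cancer = 0.01                 # P(B): prior probability of breast cancer
p_pos_given_cancer = 0.80       # P(+|B): positive mammogram given cancer
p_pos_given_no_cancer = 0.096   # P(+|B'): positive mammogram given no cancer

p_pos = p_pos_given_cancer * p_cancer + p_pos_given_no_cancer * (1 - p_cancer)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos

print(p_pos)               # 0.10304 (the slide rounds this to 0.1030)
print(p_cancer_given_pos)  # 0.0776..., about 7.77%
```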
Example 2
Two different suppliers, A and B, provide a manufacturer with the same part. All supplies of this part are kept in a large bin. In the past, 5% of the parts supplied by A and 9% of the parts supplied by B have been defective. A supplies four times as many parts as B. Suppose you reach into the bin, select a part, and find it is nondefective. What is the probability that it was supplied by A?
Solution
Let D denote the event that the selected part is nondefective.
5% of the parts supplied by A and 9% of the parts supplied by B have been defective, so P(D│A) = 1 − 0.05 = 0.95 and P(D│B) = 1 − 0.09 = 0.91.
A supplies four times as many parts as B, so P(A) = 0.8 and P(B) = 0.2.
You reach into the bin, select a part, and find it is nondefective. What is the probability that it was supplied by A? Find P(A│D).
Solution cont.
P(A│D) = P(D│A) P(A) / P(D)
P(D│A) = 0.95, P(A) = 0.8
P(D) = P(D│A) P(A) + P(D│B) P(B)
P(D) = (0.95)(0.8) + (0.91)(0.2) = 0.942
P(A│D) = (0.95)(0.8) / (0.942)
P(A│D) = 0.8068
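Example 2 follows exactly the same pattern as Example 1, so the calculation can be wrapped in a small helper; this is a minimal sketch (the function name and signature are my own), checked against the numbers above:

```python
def posterior(prior_a: float, lik_given_a: float, lik_given_not_a: float) -> float:
    """P(A | evidence) by Bayes' theorem, for a two-event partition {A, A'}."""
    evidence = lik_given_a * prior_a + lik_given_not_a * (1 - prior_a)
    return lik_given_a * prior_a / evidence

# Example 2: P(A | nondefective), with P(A) = 0.8, P(D|A) = 0.95, P(D|B) = 0.91.
print(posterior(0.8, 0.95, 0.91))    # 0.8067..., about 0.8068

# Example 1 as a sanity check: P(B | positive mammogram).
print(posterior(0.01, 0.80, 0.096))  # 0.0776..., about 7.77%
```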
Why is Bayes’ Theorem so cool?
It describes what makes something “evidence” and how much evidence it is. Science itself is a special case of Bayes’ theorem: a prior probability (hypothesis) is revised in light of observation or experience (experimental evidence) to arrive at a posterior probability (conclusion). It is used to judge statistical models and is widely applicable in computational biology, medicine, computer science, artificial intelligence, and more.

Thank You For Your Attention