Probability Review
            Tomoki Tsuchida
Computational & Cognitive Neuroscience Lab
    Department of Computer Science
    University of California, San Diego
Lectures based on previous bootcamp lectures/slides,
first written by Tim K. Marks (Thanks Tim!!),
then by David Groppe (Thanks David!!),
and then by Jake Olson (Thanks Jake!!)
Talk Outline
• Why study probability?
• Probability defined
• Probabilities of Events Formed from other
     Events
 ‣       P(not E)
 ‣       P(E or F)
 ‣       P(E & F)
     -    Independent events

• Conditional Probability
Why do Cognitive Scientists
    need probability?

 The answer: Uncertainty!
Everyday Perceptual Uncertainty




          Artist: Julian Beever

http://users.skynet.be/J.Beever/pave.htm
Uncertainty in Scientific Perception
Talk Outline
• Why is probability useful?
• Probability Defined
• Probabilities of Events Formed from other
     Events
 ‣       P(not E)
 ‣       P(E or F)
 ‣       P(E & F)
     -    Independent events

• Conditional Probability
We now know Probability is useful...

But what does it mean?

Examples:

“There is a 47% chance of red.”



“There is a 47% chance of rain.”
Two of the ways to interpret the probability:

Frequentist: the percentage of
time some event will happen.

Bayesian: how strongly
you believe that something
is true.

(Philosophical debates ensue for either interpretation. Or, one can forget about the
interpretation and focus on the properties alone, as mathematicians do :P)
Probability based on Axiomatic
            Theory
Random Experiment: An experiment with
stochastic (nondeterministic) result
(e.g. “two coin tosses”)
Outcome: Result of the experiment
(e.g. HH, TH etc.)
Sample Space (Ω): Set of all possible
outcomes of the random experiment
(Ω = {HH, TH, HT, TT} )
Event: A subset of the sample space that
satisfies certain constraints
(e.g. E={HH}, F={HH, TH}, G={HH, TT})
An Example Random Experiment:
     Rolling a six-sided die once
       Outcomes: 1, 2, 3, 4, 5, or 6
      Sample Space: Ω={1,2,3,4,5,6}
         Events: 2⁶ = 64 total (why?)

                   note:
    The sample space itself Ω (Omega)
     and the empty set ϕ (phi) or {}
            ...are also events.
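These definitions are easy to sanity-check in code. A minimal Python sketch (mine, not part of the original slides): it enumerates the sample space for one die roll and confirms there are 2⁶ = 64 events, including Ω and the empty set.

```python
from itertools import combinations

omega = {1, 2, 3, 4, 5, 6}  # sample space for one roll of a six-sided die

def powerset(s):
    """All subsets of s, from the empty set up to s itself."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

events = powerset(omega)
print(len(events))                 # 64 == 2**6 possible events
print(frozenset() in events)       # True: the empty set is an event
print(frozenset(omega) in events)  # True: Omega itself is an event
```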
Complement of an Event

Odd numbers on the die: E={1,3,5}

The complement of E, written Eᶜ, is the set of all
elements that are not members of E:
Eᶜ={2,4,6}

Draw Venn diagrams

(Remember: events are sets!
So Eᶜ is also an event itself.)
Union of Two Events

Odd numbers on the die: E={1,3,5}
Numbers less than 4: F={1,2,3}

The union of E and F is the set of all elements that
are members of E or members of F:
E or F = E ∪ F = {1,2,3,5}
Intersection of Two Events

Odd numbers on the die: E={1,3,5}
Numbers less than 4: F={1,2,3}

The intersection of E and F is the set of all
elements that are members of E and
members of F:
E & F = E ∩ F = {1,3}
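Because events are sets, Python's built-in set operations mirror complement, union, and intersection directly. A small sketch using the same die examples:

```python
omega = {1, 2, 3, 4, 5, 6}
E = {1, 3, 5}   # odd numbers on the die
F = {1, 2, 3}   # numbers less than 4

print(omega - E)  # complement of E: {2, 4, 6}
print(E | F)      # union, E or F: {1, 2, 3, 5}
print(E & F)      # intersection, E and F: {1, 3}
```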
An Axiomatic Definition of
Probability

Probability is a set function P(E) that assigns to every event E
a number called the “probability of E” such that:

1. The probability of an event is greater than or equal to zero
   P(E) ≥ 0
2. The probability of the sample space is one
   P(Ω) = 1
3. If two events are disjoint, the probability of their union
   equals the sum of their probabilities
   P(E or F) = P(E) + P(F)
An Example Random Experiment:
     Rolling a six-sided die once

 A legal probability function:
    P(1)=P(2)=P(3)=P(4)=P(5)=P(6)=1/6

 An illegal probability function:
   P(1)=P(2)=P(3)=P(4)=P(5)=P(6)=1/2

   P(Ω)=P(1)+P(2)+P(3)+P(4)+P(5)+P(6)=3
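For a finite sample space the axioms can be checked mechanically. A sketch (my own helper, not from the slides) that accepts the legal function above and rejects the illegal one:

```python
from fractions import Fraction

def is_legal(p):
    """p maps each outcome in Omega to its probability. Checks axiom 1
    (nonnegativity) and axiom 2 (P(Omega) = 1); axiom 3 (additivity over
    disjoint events) then follows by summing outcome probabilities."""
    return all(v >= 0 for v in p.values()) and sum(p.values()) == 1

legal   = {k: Fraction(1, 6) for k in range(1, 7)}
illegal = {k: Fraction(1, 2) for k in range(1, 7)}

print(is_legal(legal))    # True
print(is_legal(illegal))  # False: P(Omega) = 3, violating axiom 2
```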
Talk Outline
• Probability Defined
• Probabilities of Events Formed from other
     Events
 ‣       P(not E)
 ‣       P(E or F)
 ‣       P(E & F)
     -    Independent events

• Conditional Probability
 ‣       Bayes’ Rule
Probability of Events from other
Events: Complement of an event

P(Eᶜ) = P(not E) = 1 − P(E)

E={1,3}, Eᶜ={2,4,5,6}
P(Eᶜ) = 1 − P(E) = 1 − 1/3 = 2/3

Note: This is easy to visualize with Venn Diagrams
Probability of Events from other
  Events: Intersection of two events
         P(E ∩ F)=P(E & F)=P(E)P(F)
         If and only if E and F are
            independent events


          E={1,3,5}, F={1,2,3,4}
               E ∩ F={1,3}
           P(E)=1/2, P(F)=2/3
         P(E ∩ F)=1/3=P(E)P(F)
Probability of Events from other
Events: Union of two events

P(E ∪ F) = P(E or F) = P(E) + P(F) − P(E ∩ F)

E={1,3}, F={1,2,3}
E ∪ F={1,2,3}, E ∩ F={1,3}

P(E)=1/3, P(F)=1/2
P(E ∪ F) = P(E) + P(F) − P(E ∩ F) = 1/3 + 1/2 − 1/3 = 1/2
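With equally likely outcomes, P(event) is just |event| / |Ω|, so the union rule can be verified by counting. A sketch with the example above:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))  # uniform die

E, F = {1, 3}, {1, 2, 3}
lhs = P(E | F)                # direct: P({1,2,3}) = 1/2
rhs = P(E) + P(F) - P(E & F)  # inclusion-exclusion
print(lhs, rhs, lhs == rhs)   # 1/2 1/2 True
```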
Probability of Events from other
  Events: Intersection of two events
         P(E ∩ F)=P(E)P(F)
         If and only if E and F are
            independent events

          E={1,3}, F={1,2,3,4}
              E ∩ F={1,3}
           P(E)=2/6, P(F)=4/6
         P(E ∩ F)=2/6≠P(E)P(F)
Understanding Independence

P(E ∩ F) = P(E)P(F)
Independent events give no
information about each other.

E={1,3,5}, F={1,2,3,4}
E ∩ F={1,3}
P(E)=1/2, P(F)=2/3, P(E & F)=1/3
P(E)P(F) = (1/2)(2/3) = 1/3 = P(E & F)

note: this is different from disjoint events, for which P(E ∩ F) = 0!
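The product test makes independence a one-line check. A sketch contrasting the independent pair above with a dependent pair and a disjoint pair:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))  # uniform die

def independent(E, F):
    return P(E & F) == P(E) * P(F)

print(independent({1, 3, 5}, {1, 2, 3, 4}))  # True:  1/3 == (1/2)(2/3)
print(independent({1, 3}, {1, 2, 3, 4}))     # False: 1/3 != (1/3)(2/3)
print(independent({1, 2}, {5, 6}))           # False: disjoint, so P(E & F)=0 yet P(E)P(F)=1/9
```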
Talk Outline
• Probability Defined
• Probabilities of Events Formed from other
     Events
 ‣       P(not E)
 ‣       P(E or F)
 ‣       P(E & F)
     -    Independent events

• Conditional Probability
 ‣       Bayes’ Rule
If I tell you that I rolled a number less than 4, what is the
probability that I rolled an odd number?

Conditional Probability: The probability of event E
given that event F has happened:

P(E | F) = P(E & F) / P(F)

E={1,3,5}, F={1,2,3}
E ∩ F={1,3}
P(E & F)=1/3, P(F)=1/2
P(E|F) = (1/3)/(1/2) = 2/3
Conditional Probability
What if E and F are independent events?

P(E | F) = P(E & F) / P(F) = P(E)P(F) / P(F) = P(E)

If I tell you that I rolled a number less than 5, what is the
probability that I rolled an odd number?
(same example of independent events as before)

E={1,3,5}, F={1,2,3,4}
E ∩ F={1,3}
P(E)=1/2, P(F)=2/3, P(E & F)=1/3
P(E | F) = (1/3)/(2/3) = 1/2 = P(E)
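The same counting trick computes conditional probabilities: restrict attention to the outcomes in F. A sketch reproducing both examples:

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))  # uniform die

def P_given(E, F):
    return P(E & F) / P(F)  # definition of conditional probability

E = {1, 3, 5}                    # rolled an odd number
print(P_given(E, {1, 2, 3}))     # 2/3: "less than 4" raises the probability
print(P_given(E, {1, 2, 3, 4}))  # 1/2 == P(E): "less than 5" is uninformative
```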
The multiplication rule
Rearranging the definition of conditional probability,
we get:

P(E & F) = P(E|F)P(F)

If we have more events, say E, F, G...
(and change & to , for simpler notation):

P(E, F, G) = P(E, F|G)P(G)
           = P(E|F, G)P(F|G)P(G)

If you forget everything else today about conditional
probability, just remember:

P(EF) = P(E|F)P(F)
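The chain rule can likewise be confirmed by counting. A sketch with a third event G (my own example, not from the slides):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))  # uniform die
P_given = lambda A, B: P(A & B) / P(B)

E, F, G = {1, 3, 5}, {1, 2, 3}, {1, 2, 3, 4, 5}

joint = P(E & F & G)
chain = P_given(E, F & G) * P_given(F, G) * P(G)
print(joint, chain, joint == chain)  # 1/3 1/3 True
```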
The Wrong Conditioning
Variable:

“The CASA (Center for Addiction and Substance Abuse at Columbia)
study establishes a clear progression that begins with gateway drugs and
leads to cocaine use: nearly 90% of people who have ever tried cocaine
used all three gateway substances [alcohol, marijuana, & cigarettes]
first.”

(The reported statistic is P(used all three gateway substances | tried cocaine) ≈ 0.9,
which by itself says nothing about P(tried cocaine | used gateway substances).)

Source: http://www.columbia.edu/cu/record/archives/vol20/
vol20_iss10/record2010.24.html
Extra: Monty Hall problem
Suppose you're on a game show and you're given the choice of three doors [and will win
what is behind the chosen door]. Behind one door is a car; behind the others, goats. The car
and the goats were placed randomly behind the doors before the show. The rules of the
game show are as follows: After you have chosen a door, the door remains closed for the
time being. The game show host, Monty Hall, who knows what is behind the doors, now has
to open one of the two remaining doors, and the door he opens must have a goat behind it. If
both remaining doors have goats behind them, he chooses one [uniformly] at random. After
Monty Hall opens a door with a goat, he will ask you to decide whether you want to stay with
your first choice or to switch to the last remaining door. Imagine that you chose Door 1 and
the host opens Door 3, which has a goat. He then asks you "Do you want to switch to Door
Number 2?" Is it to your advantage to change your choice?

- Krauss and Wang 2003:10
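Conditional-probability puzzles like this are easy to settle by simulation. A sketch under the stated rules (uniformly random car placement and host choice):

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # car placed uniformly at random
        pick = random.randrange(3)   # contestant's first choice
        # Host opens a goat door other than the contestant's pick
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(play(switch=False))  # ~1/3
print(play(switch=True))   # ~2/3: switching doubles the chance of winning
```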
Talk Outline
• Probability Defined
• Probabilities of Events Formed from other
     Events
 ‣       P(not E)
 ‣       P(E or F)
 ‣       P(E & F)
     -    Independent events

• Conditional Probability
 ‣       Bayes’ Rule
Bayes’ Rule
(a.k.a. Bayes’ Theorem)

Note that the multiplication rule is symmetric, so

P(E, F) = P(E|F)P(F) = P(F|E)P(E)

Dividing through by P(F) yields

P(E|F) = P(F|E)P(E) / P(F)
Bayes’ Rule
(a.k.a. Bayes’ Theorem)

A way to infer one conditional probability
from another (probabilistic inference):

P(H|D) = P(D|H)P(H) / P(D)

Prior: P(H)          (“prior” belief about H)
Likelihood: P(D|H)   (under H, how likely it is to see D)
Posterior: P(H|D)    (“updated” belief about H
                      after seeing D)

Note: Named after Rev. Thomas Bayes (1702-1761)
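As a one-line sketch (mine, not from the slides), Bayes' rule flips a conditional whenever P(D) is known; the die example from earlier checks out:

```python
def posterior(likelihood, prior, evidence):
    """P(H|D) = P(D|H) P(H) / P(D)."""
    return likelihood * prior / evidence

# H = "rolled odd" = {1,3,5}, D = "less than 4" = {1,2,3}:
# P(D|H) = 2/3, P(H) = 1/2, P(D) = 1/2
print(posterior(likelihood=2/3, prior=1/2, evidence=1/2))  # 2/3, as before
```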
Marginalization
How do we calculate a denominator like P(E)? Note that

E = (E ∩ F) ∪ (E ∩ Fᶜ)

So that

P(E) = P(EF) + P(EFᶜ)
     = P(E|F)P(F) + P(E|Fᶜ)P(Fᶜ)
     = P(E|F)P(F) + P(E|Fᶜ)(1 − P(F))
Marginalization
So Bayes’ rule for two hypotheses is

P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|Hᶜ)P(Hᶜ)]

(phew!)
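Putting marginalization and Bayes' rule together for a binary hypothesis gives a small reusable helper (a sketch, mine rather than the slides'):

```python
def posterior_two_hyp(p_d_given_h, p_d_given_not_h, p_h):
    """P(H|D), with P(D) expanded by marginalizing over H and not-H."""
    p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)
    return p_d_given_h * p_h / p_d

# Previewing the exercise below: a 95%-sensitive test, 1% false positives,
# 0.5% base rate
print(posterior_two_hyp(0.95, 0.01, 0.005))  # ~0.323
```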
Exercise

• A laboratory blood test is 95% effective in
  detecting a disease when it is, in fact,
  present. However, the test also yields a “false
  positive” result for 1 percent of the healthy
  persons tested. If 0.5 percent of the
  population actually has the disease, what is
  the probability a person has the disease
  given that the test result is positive?

  (from Ross, Section 3.3 example 3d)
Solution

• D: the event that the tested person has the
  disease.
• E: the event that the test result is positive.
• We know: P(E | D) = 0.95, P(E | Dᶜ) = 0.01, P(D) = .005.
• We want to know P(D | E)
Solution

P(D|E) = P(D, E) / P(E)
       = P(E|D)P(D) / [P(E|D)P(D) + P(E|Dᶜ)P(Dᶜ)]
       = (.95)(.005) / [(.95)(.005) + (.01)(.995)]
       ≈ .323

(Why is it so low?)
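The exact arithmetic can be double-checked by simulating a large population, which also makes the answer intuitive. A Monte Carlo sketch under the stated rates:

```python
import random

def simulate(n=1_000_000):
    diseased_pos = healthy_pos = 0
    for _ in range(n):
        if random.random() < 0.005:                 # person has the disease
            diseased_pos += random.random() < 0.95  # true positive
        else:                                       # healthy person
            healthy_pos += random.random() < 0.01   # false positive
    return diseased_pos / (diseased_pos + healthy_pos)

print(simulate())  # ~0.32: most positives come from the much larger healthy pool
```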
Does the Brain Perform Bayesian Inference?

Knill, D. C., & Pouget, A. (2004). The Bayesian brain: the role of
uncertainty in neural coding and computation. Trends in Neurosciences,
27(12). Center for Visual Science and the Department of Brain and
Cognitive Science, University of Rochester.

From the abstract: “To use sensory information efficiently to make
judgments and guide action in the world, the brain must represent and
use information about uncertainty in its computations for perception
and action. Bayesian methods have proven successful in building
computational theories for perception and sensorimotor control, and
psychophysics is providing a growing body of evidence that human
perceptual computations are ‘Bayes’ optimal’. This leads to the
‘Bayesian coding hypothesis’: that the brain represents sensory
information probabilistically, in the form of probability
distributions. Several computational schemes have recently been
proposed for how this might be achieved in populations of neurons.
Neurophysiological data on the hypothesis, however, is almost
non-existent. A major challenge for neuroscientists is to test these
ideas experimentally, and so determine whether and how neurons code
information about sensory uncertainty.”
Does the Brain Perform Bayesian Inference?


  •       An appealing idea because:
      ‣    It could explain why the brain works as it
           does (i.e., it is performing optimal Bayesian
           inference)
  •       An unappealing idea because:
      ‣    For even some simple problems it can be
           difficult to know what the optimal Bayesian
           inference is
      ‣    Computation of probabilities can be difficult
Talk Outline



‣   Next session: Random Variables etc.


Editor's notes

4. Probability is a powerful tool for dealing with uncertainty. It is important to Cog Sci because: (1) living in the world is fraught with uncertainty (e.g., any 2D image is consistent with multiple 3D scenes), and understanding the mathematics of probability may help us understand how the brain deals with that uncertainty; (2) scientific data are noisy: behavior and neural representations are noisy, and our scientific perceptual systems are themselves fraught with uncertainty, so probability is key to making sense of our data.

10. Examples. Frequentist: chance of winning, gambles, repeatable stuff (but a lot of things are not repeatable! Wouldn't you get the same results from the same initial conditions?). Bayesian: weather, existence of aliens on Saturn; subjective belief must follow the laws of probability in order to be coherent (but how do you measure the belief? And the priors?).

12. Stop; do some examples of sample spaces and events. To build a probability model, we need at least three ingredients: What are all the things that could possibly happen? What sensible yes-no questions can we ask about these things? For any such question, what is the probability that the answer is yes? The first point is formalized by specifying a set Ω; every element ω ∈ Ω symbolizes one possible fate of the model.

35. Draw Venn diagram.