Bayesian Decision Theory
Prof. Dr. Mostafa Gadal-Haqq
Faculty of Computer & Information Sciences
Computer Science Department
AIN SHAMS UNIVERSITY
CSC446 : Pattern Recognition
(Pattern Classification, Ch. 2: Sec. 2.1 to Sec. 2.3)
ASU-CSC446 : Pattern Recognition. Prof. Dr. Mostafa Gadal-Haqq slide - 1
2.1 Bayesian Decision Theory
• Bayesian decision theory is based on quantifying the trade-offs between various classification decisions using probabilities and the costs that accompany such decisions.
• It assumes that the decision problem is posed in probabilistic terms and that all of the relevant probability values are known.
2.1 Bayesian Decision Theory
• Back to the fish-sorting machine:
– ω = a random variable (the state of nature) = {ω1, ω2}
• For example: ω1 = sea bass, and ω2 = salmon.
• P(ω1) = the prior (a priori) probability that the coming fish is sea bass.
• P(ω2) = the prior (a priori) probability that the coming fish is salmon.
– The priors give us the knowledge of how likely we are to get salmon or sea bass before the fish actually appears.
2.1 Bayesian Decision Theory
• Decision rule using priors only:
– To make a decision about the fish that will appear using only the priors, P(ω1) and P(ω2), we use the following decision rule:

    Decide fish ∈ ω1 if P(ω1) > P(ω2),
    and fish ∈ ω2 if P(ω1) < P(ω2),

– which minimizes the error:

    Probability of error = min[ P(ω1), P(ω2) ]
2.1 Bayesian Decision Theory
• That is:
– If P(ω1) >> P(ω2), we will be right most of the time when we decide that the fish belongs to ω1.
– If P(ω1) = P(ω2), we have only a fifty-fifty chance of being right.
– Under these conditions, no other decision rule can yield a larger probability of being right.
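As a minimal sketch (the class names and prior values below are illustrative, echoing the sea bass/salmon example), the priors-only rule amounts to:

```python
# Priors-only decision rule: pick the class with the largest prior.
# Class names and prior values are illustrative assumptions.

def decide_from_priors(priors):
    """priors: dict class -> P(class). Returns (decision, P(error))."""
    decision = max(priors, key=priors.get)   # decide w1 if P(w1) > P(w2)
    p_error = min(priors.values())           # error = min[P(w1), P(w2)]
    return decision, p_error

print(decide_from_priors({"sea_bass": 0.67, "salmon": 0.33}))
```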
2.1 Bayesian Decision Theory
• Improving the decision using an observation:
• If we know the class-conditional probability, P(x | ωj), of an observation x, we can improve our decision.
• For example: x describes the observed lightness of the sea bass or salmon.
[Figure: class-conditional densities P(x | ω1) and P(x | ω2) plotted against the feature x.]
2.1 Bayesian Decision Theory
• We can improve our decision by using this observed feature and the Bayes rule:
– Posterior = (Likelihood × Prior) / Evidence
– That is:

    P(ωj | x) = P(x | ωj) P(ωj) / P(x)

– where, for C categories, the evidence is:

    P(x) = Σ (j = 1 … C) P(x | ωj) P(ωj)
2.1 Bayesian Decision Theory
• The Bayesian decision is based on minimizing the probability of error; i.e., for a given feature value x:

    Decide x ∈ ω1 if P(ω1 | x) > P(ω2 | x),
    and x ∈ ω2 if P(ω1 | x) < P(ω2 | x).

• The probability of error for a particular x is:

    P(error | x) = min[ P(ω1 | x), P(ω2 | x) ]
2.1 Bayesian Decision Theory: Numerical Example
Suppose P(ω1) = 2/3 = 0.67 and P(ω2) = 1/3 = 0.33.
Using the priors only: fish → ω1.
[Figure: class-conditional densities P(x | ω1) and P(x | ω2), with the values 0.15 and 0.36 marked at the observed x.]
If x = 11.5, then P(x | ω1) = 0.15 and P(x | ω2) = 0.36:
    P(x) = 0.15 × 0.67 + 0.36 × 0.33 = 0.22
    P(ω1 | x) = 0.15 × 0.67 / 0.22 = 0.46
    P(ω2 | x) = 0.36 × 0.33 / 0.22 = 0.54
Using the posteriors: fish(x) → ω2.
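The numerical example above can be reproduced directly; the likelihoods and priors below are the slide's own values:

```python
# Bayes rule on the example's numbers: P(w1)=0.67, P(w2)=0.33,
# P(x|w1)=0.15, P(x|w2)=0.36 at x = 11.5.

def posteriors(likelihoods, priors):
    """likelihoods[j] = P(x | w_j), priors[j] = P(w_j); returns P(w_j | x)."""
    evidence = sum(l * p for l, p in zip(likelihoods, priors))   # P(x)
    return [l * p / evidence for l, p in zip(likelihoods, priors)]

post = posteriors([0.15, 0.36], [0.67, 0.33])
print([round(p, 2) for p in post])  # -> [0.46, 0.54], so decide w2
```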
2.1 Bayesian Decision Theory
Computing the posteriors for all values of x gives the decision regions (rules):
• if x ∈ R1, decide ω1
• if x ∈ R2, decide ω2
[Figure: posterior curves over x, with the decision regions alternating R1, R2, R1, R2 along the x-axis.]
Assignment 2.1
• Draw the probability densities and find the decision regions for the following classes:
    Ω = {ω1, ω2},
    P(x | ω1) ~ N(20, 4),
    P(x | ω2) ~ N(15, 2),
    P(ω1) = 1/3, and P(ω2) = 2/3.
– Then classify a sample with feature value x = 17.
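A sketch of the computation such an assignment calls for (a hypothetical helper, not the official solution; it assumes N(μ, σ²) denotes mean and variance — follow the course's convention if it means standard deviation instead):

```python
import math

# Hypothetical sketch for a two-class univariate Gaussian problem.
# Assumes N(mu, var) gives mean and variance.

def gauss_pdf(x, mu, var):
    """Univariate normal density N(mu, var) evaluated at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(x, params, priors):
    """Return the index of the class maximizing P(x | w_j) P(w_j)."""
    scores = [gauss_pdf(x, mu, var) * p for (mu, var), p in zip(params, priors)]
    return scores.index(max(scores))

# Assignment 2.1 data: w1 ~ N(20, 4), w2 ~ N(15, 2); P(w1)=1/3, P(w2)=2/3.
label = classify(17, [(20, 4), (15, 2)], [1 / 3, 2 / 3])
print("x = 17 -> w%d" % (label + 1))
```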
2.2 General Bayesian Decision Theory
• Bayesian decision theory is generalized by allowing the following:
– Having more than one feature.
– Having more than two states of nature.
– Allowing actions, rather than only deciding on the state of nature.
– Introducing a loss function, which is more general than the probability of error.
2.2 General Bayesian Decision Theory
• Allowing actions other than classification primarily allows the possibility of rejection.
• Rejection is refusing to make a decision in ambiguous or borderline cases.
• The loss function states how costly each action taken is.
2.2 General Bayesian Decision Theory
• Suppose we have c states of nature (categories):
    Ω = { ω1, ω2, …, ωc },
• a feature vector:
    x = { x1, x2, …, xd },
• the possible actions:
    A = { α1, α2, …, αa },
• and the loss, λ(αi | ωj), incurred for taking action αi when the state of nature is ωj.
2.2 General Bayesian Decision Theory
• The conditional risk, R(αi | x), for selecting the action αi is given by:

    R(αi | x) = Σ (j = 1 … c) λ(αi | ωj) P(ωj | x)

• The overall risk, R, is the sum of all conditional risks R(αi | x) for i = 1, …, a:

    R = Σ (i = 1 … a) R(αi | x)
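A sketch of the conditional-risk computation (the loss matrix and posterior values below are illustrative assumptions):

```python
# Conditional risk R(a_i | x) = sum_j lambda(a_i | w_j) P(w_j | x).
# The loss matrix and posteriors are illustrative assumptions.

def conditional_risks(loss, posteriors):
    """loss[i][j] = lambda(a_i | w_j); returns R(a_i | x) for each action."""
    return [sum(lij * p for lij, p in zip(row, posteriors)) for row in loss]

loss = [[0.0, 1.0],   # action a1: no loss if w1, unit loss if w2
        [1.0, 0.0]]   # action a2: unit loss if w1, no loss if w2
risks = conditional_risks(loss, [0.46, 0.54])
print(risks)  # -> [0.54, 0.46]; the minimum-risk action is a2
```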
2.2 General Bayesian Decision Theory
The Bayesian decision rule becomes: select the action αi for which the conditional risk, R(αi | x), is minimum. That is:

    Take action αi (i.e., decide ωi)
    if R(αi | x) < R(αj | x),  ∀ j, j ≠ i.
2.2 General Bayesian Decision Theory
• Minimizing R(αi | x) over the actions, that is, over all i = 1, …, a, minimizes R.
• The overall risk R is the “expected loss associated with a given decision rule”.
• The resulting minimum overall risk is called the Bayes risk, which defines the best performance that can be achieved.
2.2 General Bayesian Decision Theory
• Two-category classification example:
Suppose we have two categories {ω1, ω2} and two actions {α1, α2}, where:
    α1 : deciding ω1, and α2 : deciding ω2,
and for simplicity we write λij = λ(αi | ωj).
The conditional risks for taking α1 and α2 are:
    R(α1 | x) = λ11 P(ω1 | x) + λ12 P(ω2 | x)
    R(α2 | x) = λ21 P(ω1 | x) + λ22 P(ω2 | x)
2.2 General Bayesian Decision Theory
There are a variety of ways to express the minimum-risk rule, each with its own advantage:
1- The fundamental rule is:

    decide ω1 (i.e., α1) if R(α1 | x) < R(α2 | x),
    and ω2 (i.e., α2) if R(α1 | x) > R(α2 | x).
2.2 General Bayesian Decision Theory
2- The rule in terms of the posteriors is:

    decide ω1 if (λ21 − λ11) P(ω1 | x) > (λ12 − λ22) P(ω2 | x);
    decide ω2 otherwise.

3- The rule in terms of the priors and conditional densities is:

    decide ω1 if (λ21 − λ11) p(x | ω1) P(ω1) > (λ12 − λ22) p(x | ω2) P(ω2);
    decide ω2 otherwise.
2.2 General Bayesian Decision Theory
4- The rule in terms of the likelihood ratio:

    decide ω1 if  p(x | ω1) / p(x | ω2) > [(λ12 − λ22) / (λ21 − λ11)] · P(ω2) / P(ω1);
    decide ω2 otherwise.

That is, the Bayes (optimal) decision can be interpreted as:
“One can take an optimal decision if the likelihood ratio exceeds a threshold value that is independent of the observation x.”
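The likelihood-ratio form can be sketched directly (the loss matrix here is the zero-one loss; the likelihood and prior values reuse the earlier numerical example):

```python
# Likelihood-ratio form of the minimum-risk rule.
# lam[i][j] = loss for taking action i when the true class is j.

def decide(px_w1, px_w2, p1, p2, lam):
    """Decide class 1 iff p(x|w1)/p(x|w2) exceeds the loss/prior threshold."""
    threshold = (lam[0][1] - lam[1][1]) / (lam[1][0] - lam[0][0]) * p2 / p1
    return 1 if px_w1 / px_w2 > threshold else 2

# Zero-one loss: the threshold reduces to P(w2)/P(w1).
zero_one = [[0, 1], [1, 0]]
print(decide(0.15, 0.36, 0.67, 0.33, zero_one))  # -> 2, as in the example
```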
2.2 General Bayesian Decision Theory
• Let θ = [(λ12 − λ22) / (λ21 − λ11)] · P(ω2) / P(ω1); then decide ω1 if:

    p(x | ω1) / p(x | ω2) > θ

• The decision regions depend on the values of the loss function. For two different loss functions λa and λb we have:
    (a) if λa = [0 1; 1 0], then θa = P(ω2) / P(ω1);
    (b) if λb = [0 2; 1 0], then θb = 2 P(ω2) / P(ω1).
2.2 General Bayesian Decision Theory
[Figure: decision regions induced by the two thresholds θa and θb.]
2.3 Minimum-Error-Rate Classification
• Consider the zero-one (or symmetrical) loss function:

    λ(αi | ωj) = 0 if i = j, and 1 if i ≠ j,  for i, j = 1, …, c.

• Therefore, the conditional risk is:

    R(αi | x) = Σ (j = 1 … c) λ(αi | ωj) P(ωj | x)
              = Σ (j ≠ i) P(ωj | x)
              = 1 − P(ωi | x)

• In other words, for the symmetric loss function, the conditional risk is the probability of error.
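The identity R(αi | x) = 1 − P(ωi | x) for the zero-one loss can be checked numerically (the posterior values are illustrative):

```python
# For zero-one loss, R(a_i | x) = 1 - P(w_i | x): verify on sample posteriors.
posteriors = [0.46, 0.54]                      # illustrative P(w_j | x)
loss = [[0 if i == j else 1 for j in range(2)] for i in range(2)]
risks = [sum(loss[i][j] * posteriors[j] for j in range(2)) for i in range(2)]
assert all(abs(r - (1 - p)) < 1e-12 for r, p in zip(risks, posteriors))
print(risks)  # the minimum-risk action matches the maximum posterior
```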
The Minimax Criterion
• Sometimes we need to design our classifier to perform well over a range of prior probabilities, or when we do not know the prior probabilities.
• A reasonable approach is to design the classifier so that the worst overall risk for any value of the priors is as small as possible.
• Minimax criterion: “minimize the maximum possible overall risk.”
The Minimax Criterion
• It can be shown that the overall risk is linear in P(ωj). When the constant of proportionality (the slope) is zero, the risk is independent of the priors; this condition gives the minimax risk, Rmm.
The Minimax Criterion
The Neyman-Pearson Criterion
• The Neyman-Pearson criterion:
“minimize the overall risk subject to a constraint,” e.g.:

    ∫ R(αi | x) dx < constant.

• Generally, the Neyman-Pearson criterion is satisfied by adjusting the decision boundaries numerically. However, for Gaussian and some other distributions, its solution can be found analytically.
Assignment 2.2
• Computer exercise:
– Find the optimal decision for the following data:
    Ω = {ω1, ω2},
    p(x | ω1) ~ N(20, 4),
    p(x | ω2) ~ N(15, 2),
    P(ω1) = 2/3, and P(ω2) = 1/3,
– with the loss function:
    λ = [ 1    2
          0.5  1 ]
– Then classify the samples: x = 12, 17, 18, and 20.
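A hypothetical sketch of how such an exercise could be computed (not the official solution; it assumes N(μ, σ²) means mean and variance, and uses an illustrative zero-one loss matrix rather than the assignment's):

```python
import math

# Minimum-risk classification for two univariate Gaussian classes.
# Assumes N(mu, var); the loss matrix below is an illustrative zero-one loss.

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def min_risk_decision(x, params, priors, lam):
    """Return the index of the action with minimum conditional risk at x."""
    joint = [gauss_pdf(x, mu, var) * p for (mu, var), p in zip(params, priors)]
    evidence = sum(joint)
    post = [j / evidence for j in joint]                      # P(w_j | x)
    risks = [sum(l * p for l, p in zip(row, post)) for row in lam]
    return risks.index(min(risks))

params, priors = [(20, 4), (15, 2)], [2 / 3, 1 / 3]
lam = [[0, 1], [1, 0]]                                        # zero-one loss
for x in (12, 17, 18, 20):
    print(x, "-> w%d" % (min_risk_decision(x, params, priors, lam) + 1))
```

With the zero-one loss this reduces to choosing the maximum posterior; plugging in the assignment's actual loss matrix only changes the `lam` argument.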
 
INTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTDINTERNSHIP ON PURBASHA COMPOSITE TEX LTD
INTERNSHIP ON PURBASHA COMPOSITE TEX LTD
 
Consent & Privacy Signals on Google *Pixels* - MeasureCamp Amsterdam 2024
Consent & Privacy Signals on Google *Pixels* - MeasureCamp Amsterdam 2024Consent & Privacy Signals on Google *Pixels* - MeasureCamp Amsterdam 2024
Consent & Privacy Signals on Google *Pixels* - MeasureCamp Amsterdam 2024
 
Learn How Data Science Changes Our World
Learn How Data Science Changes Our WorldLearn How Data Science Changes Our World
Learn How Data Science Changes Our World
 
Advanced Machine Learning for Business Professionals
Advanced Machine Learning for Business ProfessionalsAdvanced Machine Learning for Business Professionals
Advanced Machine Learning for Business Professionals
 

Bay Area Real Estate Guide

  • 1. Bayesian Decision Theory Prof. Dr. Mostafa Gadal-Haqq Faculty of Computer & Information Sciences Computer Science Department AIN SHAMS UNIVERSITY ASU-CSC446 : Pattern Recognition. Prof. Dr. Mostafa Gadal-Haqq slide - 1 CSC446 : Pattern Recognition (Pattern Classifications, Ch2: Sec. 2.1 to Sec. 2.3)
  • 2. 2.1 Bayesian Decision Theory • Bayesian Decision Theory is based on quantifying the trade-offs between various classification decisions using probabilities and the costs that accompany such decisions. • It assumes that the decision problem is posed in probabilistic terms and that all of the relevant probability values are known.
  • 3. 2.1 Bayesian Decision Theory • Back to the fish-sorting machine: – ω = a random variable (the state of nature) = {ω1, ω2} • For example: ω1 = sea bass, and ω2 = salmon • P(ω1) = the prior (a priori probability) that the coming fish is sea bass. • P(ω2) = the prior (a priori probability) that the coming fish is salmon. – The priors give us the knowledge of how likely we are to get sea bass or salmon before the fish actually appears.
  • 4. 2.1 Bayesian Decision Theory • Decision rule using priors only: – To make a decision about the fish that will appear using only the priors, P(ω1) and P(ω2), we use the following decision rule: Decide ω1 if P(ω1) > P(ω2), and ω2 if P(ω1) < P(ω2) – which minimizes the error: Probability of error = min[P(ω1), P(ω2)]
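The priors-only rule above can be written as a short function (a minimal sketch; the prior values passed in are hypothetical):

```python
# Priors-only decision rule: pick the class with the larger prior;
# the probability of error is then the smaller of the two priors.
def decide_from_priors(p_w1, p_w2):
    decision = 1 if p_w1 > p_w2 else 2
    p_error = min(p_w1, p_w2)
    return decision, p_error

decision, p_error = decide_from_priors(2/3, 1/3)  # hypothetical priors
```

With these priors the rule always decides ω1 and is wrong one time in three, which is the best any rule can do before the fish is observed.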
  • 5. 2.1 Bayesian Decision Theory • That is: – If P(ω1) >> P(ω2), we will be right most of the time when we decide that the fish belongs to ω1. – If P(ω1) = P(ω2), we have only a fifty-fifty chance of being right. – Under these conditions, no other decision rule can yield a larger probability of being right.
  • 6. 2.1 Bayesian Decision Theory • Improving the decision using an observation: • If we know the class-conditional probability, P(x | ωj), of an observation x, we can improve our decision. • For example: x describes the observed lightness of the sea bass or salmon. [Figure: the class-conditional densities P(x|ω1) and P(x|ω2)]
  • 7. 2.1 Bayesian Decision Theory • We can improve our decision by using this observed feature and the Bayes rule: – Posterior = (Likelihood × Prior) / Evidence – That is: P(ωj | x) = P(x | ωj) P(ωj) / P(x) – where, for C categories: P(x) = Σ_{j=1..C} P(x | ωj) P(ωj)
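The Bayes rule on this slide can be sketched as a small function (the likelihood and prior values below are illustrative, not from the slides):

```python
# Posterior for each class: P(w_j | x) = P(x | w_j) * P(w_j) / P(x),
# where the evidence P(x) = sum over j of P(x | w_j) * P(w_j).
def posteriors(likelihoods, priors):
    evidence = sum(l * p for l, p in zip(likelihoods, priors))  # P(x)
    return [l * p / evidence for l, p in zip(likelihoods, priors)]

post = posteriors([0.15, 0.36], [2/3, 1/3])  # hypothetical inputs
```

Dividing by the evidence guarantees the posteriors sum to 1, so they can be compared directly as probabilities.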
  • 8. 2.1 Bayesian Decision Theory • Bayesian decision is based on minimizing the probability of error; i.e., for a given feature value x: Decide ω1 if P(ω1 | x) > P(ω2 | x), and ω2 if P(ω1 | x) < P(ω2 | x) • The probability of error for a particular x is: P(error | x) = min[P(ω1 | x), P(ω2 | x)]
  • 9. 2.1 Bayesian Decision Theory: Numerical Example • Suppose P(ω1) = 2/3 = 0.67 and P(ω2) = 1/3 = 0.33. • If x = 11.5, then P(x | ω1) = 0.15 and P(x | ω2) = 0.36 (read off the density curves). • P(x) = 0.15 × 0.67 + 0.36 × 0.33 = 0.22 • P(ω1 | x) = 0.15 × 0.67 / 0.22 = 0.46 • P(ω2 | x) = 0.36 × 0.33 / 0.22 = 0.54 • So with the observation, fish(x) → ω2, whereas the priors alone would give fish → ω1.
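The example's arithmetic can be checked directly; this sketch uses the rounded priors 0.67 and 0.33, as the slide does:

```python
# Reproduce the numerical example at x = 11.5 (all values from the slide).
p_x_w1, p_x_w2 = 0.15, 0.36   # class-conditional densities at x = 11.5
p_w1, p_w2 = 0.67, 0.33       # rounded priors, as on the slide
p_x = p_x_w1 * p_w1 + p_x_w2 * p_w2   # evidence P(x), ~0.22
post_w1 = p_x_w1 * p_w1 / p_x         # ~0.46
post_w2 = p_x_w2 * p_w2 / p_x         # ~0.54 -> decide w2
```

Even though ω1 is twice as likely a priori, the observation x = 11.5 is enough to tip the decision to ω2.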
  • 10. 2.1 Bayesian Decision Theory • Computing the posteriors for all values of x gives decision regions (rules): – if x ∈ R1 decide ω1 – if x ∈ R2 decide ω2 [Figure: the decision regions R1 and R2 along the x-axis]
  • 11. Assignment 2.1 • Draw the probability densities and find the decision regions for the following classes: Ω = {ω1, ω2}, P(x | ω1) ~ N(20, 4), P(x | ω2) ~ N(15, 2), P(ω1) = 1/3, and P(ω2) = 2/3. – Then classify a sample with feature value x = 17.
  • 12. 2.2 General Bayesian Decision Theory • Bayesian decision theory is generalized by allowing the following: – Having more than one feature. – Having more than two states of nature. – Allowing actions, and not only deciding on the state of nature. – Introducing a loss function, which is more general than the probability of error.
  • 13. 2.2 General Bayesian Decision Theory • Allowing actions other than classification primarily allows the possibility of rejection. • Rejection is refusing to make a decision in close or bad cases! • The loss function states how costly each action taken is.
  • 14. 2.2 General Bayesian Decision Theory • Suppose we have c states of nature (categories): Ω = {ω1, ω2, …, ωc}, • a feature vector: x = {x1, x2, …, xd}, • the possible actions: {α1, α2, …, αa}, • and the loss, λ(αi | ωj), incurred for taking action αi when the state of nature is ωj.
  • 15. 2.2 General Bayesian Decision Theory • The conditional risk, R(αi | x), for selecting the action αi is given by: R(αi | x) = Σ_{j=1..c} λ(αi | ωj) P(ωj | x) • The overall risk, R, is the sum of all conditional risks R(αi | x) for i = 1, …, a: R = Σ_{i=1..a} R(αi | x)
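The conditional-risk formula is a weighted sum and translates directly to code; the loss matrix and posterior values below are illustrative, not from the slides:

```python
# R(a_i | x) = sum over j of lambda(a_i | w_j) * P(w_j | x): weight each
# possible loss for action a_i by the posterior of the state that incurs it.
def conditional_risk(loss_row, post):
    return sum(lam * p for lam, p in zip(loss_row, post))

loss = [[0, 1],    # losses for action a1 under states w1, w2
        [1, 0]]    # losses for action a2 under states w1, w2
post = [0.7, 0.3]  # hypothetical posteriors P(w1|x), P(w2|x)
risks = [conditional_risk(row, post) for row in loss]  # [0.3, 0.7]
```

Here action α1 has the smaller conditional risk, so the Bayes rule on the next slide would select it.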
  • 16. 2.2 General Bayesian Decision Theory • The Bayesian decision rule becomes: select the action αi for which the conditional risk, R(αi | x), is minimum. That is: Take action αi (i.e., decide ωi) if R(αi | x) < R(αj | x) for all j with j ≠ i.
  • 17. 2.2 General Bayesian Decision Theory • Minimizing R(αi | x) for all actions, that is, for all i = 1, …, a, minimizes R. • The overall risk R is the “expected loss associated with a given decision rule”. • The minimum overall risk R is called the Bayes risk, which defines the best performance that can be achieved!
  • 18. 2.2 General Bayesian Decision Theory • Two-category classification example: Suppose we have two categories {ω1, ω2} and two actions {α1, α2}, where α1 means deciding ω1, α2 means deciding ω2, and for simplicity we write λij = λ(αi | ωj). The conditional risks for taking α1 and α2 are: R(α1 | x) = λ11 P(ω1 | x) + λ12 P(ω2 | x) R(α2 | x) = λ21 P(ω1 | x) + λ22 P(ω2 | x)
  • 19. 2.2 General Bayesian Decision Theory • There are a variety of ways to express the minimum-risk rule, each with its own advantage: 1- The fundamental rule is: decide ω1 (i.e., take α1) if R(α1 | x) < R(α2 | x), and ω2 (i.e., α2) if R(α1 | x) > R(α2 | x)
  • 20. 2.2 General Bayesian Decision Theory 2- The rule in terms of the posteriors is: decide ω1 if (λ21 − λ11) P(ω1 | x) > (λ12 − λ22) P(ω2 | x); decide ω2 otherwise. 3- The rule in terms of the priors and conditional densities is: decide ω1 if (λ21 − λ11) P(x | ω1) P(ω1) > (λ12 − λ22) P(x | ω2) P(ω2); decide ω2 otherwise.
  • 21. 2.2 General Bayesian Decision Theory 4- The rule in terms of the likelihood ratio: decide ω1 if p(x | ω1) / p(x | ω2) > [(λ12 − λ22) / (λ21 − λ11)] · [P(ω2) / P(ω1)]; decide ω2 otherwise. • That is, the Bayes (optimal) decision can be interpreted as: “One can take an optimal decision if the likelihood ratio exceeds a threshold value that is independent of the observation x.”
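Rule 4 can be sketched directly in code; the numbers in the check below reuse the slide-9 example values together with a zero-one loss:

```python
# Likelihood-ratio rule: decide w1 iff p(x|w1)/p(x|w2) exceeds the threshold
# (l12 - l22)/(l21 - l11) * P(w2)/P(w1), which does not depend on x.
def decide(p_x_w1, p_x_w2, p_w1, p_w2, lam):
    theta = (lam[0][1] - lam[1][1]) / (lam[1][0] - lam[0][0]) * (p_w2 / p_w1)
    return 1 if p_x_w1 / p_x_w2 > theta else 2

zero_one = [[0, 1], [1, 0]]  # lam[i][j]: loss for action i+1 when state is w_{j+1}
# Slide-9 numbers: ratio 0.15/0.36 < threshold 0.5 -> decide w2, as before.
```

With zero-one loss the threshold reduces to P(ω2)/P(ω1), so the rule reproduces the minimum-error decision of Section 2.1.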
  • 22. 2.2 General Bayesian Decision Theory • The decision regions depend on the values of the loss function. • Let θλ = [(λ12 − λ22) / (λ21 − λ11)] · [P(ω2) / P(ω1)]; then decide ω1 if: p(x | ω1) / p(x | ω2) > θλ • For different loss functions λ we have: – if λ = [0 1; 1 0] then θa = P(ω2) / P(ω1) – if λ = [0 2; 1 0] then θb = 2 P(ω2) / P(ω1)
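The two thresholds θa and θb follow from the same formula; this sketch plugs in the slide's two loss matrices with a hypothetical prior ratio:

```python
# theta = (l12 - l22) / (l21 - l11) * P(w2)/P(w1) for a 2x2 loss matrix lam.
def threshold(lam, prior_ratio):  # prior_ratio = P(w2)/P(w1)
    return (lam[0][1] - lam[1][1]) / (lam[1][0] - lam[0][0]) * prior_ratio

ratio = 0.5                                   # hypothetical P(w2)/P(w1)
theta_a = threshold([[0, 1], [1, 0]], ratio)  # = P(w2)/P(w1)   -> 0.5
theta_b = threshold([[0, 2], [1, 0]], ratio)  # = 2 P(w2)/P(w1) -> 1.0
```

Doubling the loss for misclassifying an ω2 fish doubles the threshold, which enlarges the region where we decide ω2.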
  • 23. 2.2 General Bayesian Decision Theory
  • 24. 2.3 Minimum-Error-Rate Classification • Consider the zero-one (or symmetrical) loss function: λ(αi | ωj) = 0 if i = j, and 1 if i ≠ j, for i, j = 1, …, c • Therefore, the conditional risk is: R(αi | x) = Σ_{j=1..c} λ(αi | ωj) P(ωj | x) = Σ_{j≠i} P(ωj | x) = 1 − P(ωi | x) • In other words, for the symmetrical loss function, the conditional risk is the probability of error.
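The zero-one derivation can be checked with a small sketch (the three posterior values are hypothetical):

```python
# Under zero-one loss, R(a_i | x) = sum over j != i of P(w_j | x)
#                                 = 1 - P(w_i | x),
# so minimizing the risk is the same as picking the largest posterior.
def zero_one_risk(i, post):
    return sum(p for j, p in enumerate(post) if j != i)

post = [0.2, 0.5, 0.3]                              # hypothetical posteriors
risks = [zero_one_risk(i, post) for i in range(3)]  # ~[0.8, 0.5, 0.7]
best = min(range(3), key=lambda i: risks[i])        # index of largest posterior
```

This is why minimum-risk classification under symmetrical loss is called minimum-error-rate classification: the two rules coincide.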
  • 25. The Minimax Criterion • Sometimes we need to design our classifier to perform well over a range of prior probabilities, or where we do not know the prior probabilities. • A reasonable approach is to design our classifier so that the worst overall risk for any value of the priors is as small as possible. • Minimax criterion: “minimize the maximum possible overall risk”.
  • 26. The Minimax Criterion • It is found that the overall risk is linear in P(ωj). Then, when the constant of proportionality (the slope) is zero, the risk is independent of the priors. This condition gives the minimax risk Rmm.
  • 27. The Minimax Criterion
  • 28. The Neyman-Pearson Criterion • The Neyman-Pearson criterion: “minimize the overall risk subject to a constraint”, e.g.: ∫ R(αi | x) dx < constant • Generally, the Neyman-Pearson criterion is satisfied by adjusting decision boundaries numerically. However, for Gaussian and some other distributions, its solution can be found analytically.
  • 29. Assignment 2.2 • Computer exercises: – Find the optimal decision for the following data: Ω = {ω1, ω2}, p(x | ω1) ~ N(20, 4), p(x | ω2) ~ N(15, 2), P(ω1) = 2/3, and P(ω2) = 1/3, – with the loss function: λ = [λ11 λ12; λ21 λ22] = [1 0.5; 2 1] – Then classify the samples: x = 12, 17, 18, and 20.