Some of the things you wanted to know about uncertainty (and were too busy to ask)

                                 Sébastien Destercke

                             Heudiasyc, CNRS Compiegne, France


                                      SUM 2012




Sébastien Destercke (CNRS)               SUM tutorial            SUM 2012   1 / 56
Me, myself and I



    2005-2008: PhD student
            Main topic: uncertainty modelling and treatment in nuclear safety
    2009-2011: research engineer in an agronomical research institute
            Main topic: decision aid in agronomical production chains
    2011- ?: researcher at the Centre National de la Recherche
    Scientifique (CNRS), in the Heudiasyc joint research unit
            Main (current) topics: reliability analysis and machine learning
Only common point: the modelling and handling of uncertainty
(including an imprecision component)




Tutorial goals and contents

What you will find in this tutorial
     Mostly practical considerations about uncertainty
     An overview of "mainstream" uncertainty theories
     Elements and illustrations of their use to
             build or learn uncertainty representations
             make inference (and decision)
     A "personal" view about those things

What you will not find in this tutorial
     A deep and exhaustive study of a particular topic
     Elements about other important problems (learning models, information
     fusion/revision)

Introductory elements


Plan



1    Introductory elements


2    How to represent uncertainty?


3    How to draw conclusions from information and decide?


4    Some final comments






Section goals: it’s all about basics




     Introduce a basic framework
     Give basic ideas about uncertainty
     Introduce some basic problems






A generic framework



[schematic] datum ω (singular information)  |  data (population) {ω1 , . . . , ωn } + model (generic information)


    model: describes a relation in the data space
    singular information: concerns a particular situation/individual
    generic information: describes a general relationship, the behaviour
    of a population, . . .





Uncertainty origins

Uncertainty: the inability to precisely answer a question about a quantity
Can concern both:
    Singular information
            items in a database, values of some logical variables, time before
            failure of a component
    Generic information
            parameter values of classifiers/regression models, time before
            failure of components, truth of a logical sentence ("birds fly")

Main origins
     Variability of a population → only concerns generic information
     Imprecision due to a lack of information
     Conflict between different sources of information (data/expert)




Some examples



Classification
    Data space=input features X × (structured) classes Y
     model: classifier with parameters
     Uncertainty: mostly about model parameters
     Common problem: predict classes of individuals (singular
     information)




Some examples



Risk and reliability analysis
     Data space=input variables X × output variable(s) Y
     Model: transfer/structure function f : X → Y
     Uncertainty: very often about X (sometimes f parameters)
     Common problem: obtain information about Y, either generic
     (failure of products) or singular (nuclear power plant)




Some examples



Data mining/clustering
     Data space=data features
     Model: clusters, rules, . . .
     Uncertainty: mostly about model parameters
     Common problem: obtain the model from data {ω1 , . . . , ωn }





Some examples



Data base querying
    Data space=data features
    Model: a query inducing preferences over observations
    Uncertainty: mostly about the query, sometimes data
    Common problem: retrieve and order interesting items in
    {ω1 , . . . , ωn }




Some examples



Propositional logic
     Data space=set of possible interpretations
     Model: set of sentences of the language
     Uncertainty: on sentences or on the state of some atoms
     Common problem: deduce the uncertainty about the truth of a
     sentence S from facts and knowledge




Handling uncertainty


Common problems in one sentence
    Learning: use singular information to estimate generic information
    Inference: interrogate the model and observations to deduce information on a
    quantity of interest
    Information fusion: merge multiple pieces of information about the same
    quantity
    Information revision: merge new information with old information

How to represent uncertainty?


Plan



1    Introductory elements


2    How to represent uncertainty?


3    How to draw conclusions from information and decide?


4    Some final comments






Section goals




    Introduce main ideas of theories
    Provide elements about links between them
    Illustrate how to get uncertainty representations within each






Basic framework

Quantity S with possible exclusive states S = {s1 , . . . , sn }
     S: data feature, model parameter, . . .
Basic tools
A confidence degree µ : 2^S → [0, 1] is such that
     µ(A): confidence that S ∈ A
     µ(∅) = 0, µ(S) = 1
     A ⊆ B ⇒ µ(A) ≤ µ(B)
Uncertainty modelled by a pair of lower/upper degrees µ, µ̄ : 2^S → [0, 1]:
     µ(A) ≤ µ̄(A)
     µ̄(A) = 1 − µ(Ac ) (duality)





Probability

Basic tool
A probability distribution p : S → [0, 1] from which
     µ(A) = µ̄(A) = µ(A) = Σs∈A p(s)
     µ(A) = 1 − µ(Ac ): self-dual
Main interpretations
     Frequentist [3]: µ(A) = frequency with which A is observed in a
     population
         only applies when THERE IS a population
     Subjectivist [1]: µ(A) = fair price for a gamble paying 1 if A happens, 0 if
     not
         applies to singular situations and to populations




Probability and imprecision: short comment



    Probability often partially specified over S
    Probability on rest of S usually imprecise

A small example
    S = {s1 , s2 , s3 , s4 }
    p(s1 ) = 0.1, p(s2 ) = 0.4
    we deduce p(si ) ∈ [0, 0.5] for i = 3, 4
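This deduction can be checked mechanically; a minimal sketch (the function name is illustrative, not from the slides):

```python
# Bounds on the unspecified atoms of a partially specified probability:
# each remaining atom can receive anything from 0 up to the leftover mass.
def atom_bounds(known):
    """known: dict mapping specified atoms to their probabilities."""
    leftover = 1.0 - sum(known.values())
    return (0.0, leftover)

lo, hi = atom_bounds({"s1": 0.1, "s2": 0.4})
# p(s3), p(s4) ∈ [0, 0.5]
```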






Probability and imprecision: short comment



     Probability often partially specified over S
     Probability on rest of S usually imprecise

Another (logical) example
     q, r two propositional variables
     P(¬q ∨ r ) = α, P(q) = β
     we deduce P(r ) ∈ [β − 1 + α, α]






Sets

Basic tool
A set E ⊆ S containing the true value (S ∈ E) for sure, from which
     E ⊆ A ⇒ µ(A) = µ̄(A) = 1 (truth certainly in A)
     E ∩ A ≠ ∅, E ∩ Ac ≠ ∅ ⇒ µ(A) = 0, µ̄(A) = 1 (ignorance)
     E ∩ A = ∅ ⇒ µ(A) = µ̄(A) = 0 (truth cannot be in A)
µ, µ̄ are binary → limited expressiveness


Classical use of sets:
     Interval analysis [2] (E is a subset of R)
     Propositional logic (E is the set of models of a KB)
Other cases: robust optimisation, decision under risk, . . .




In summary


Probabilities . . .
     (+) very informative quantification (do we need it?)
     (-) need lots of information (do we have it?)
     (-) if there is not enough, require a choice (do we want to make one?)
     use probabilistic calculus (convolution, stoch. independence, . . . )
Sets . . .
     (+) need very little information
     (-) very rough quantification of uncertainty (is it sufficient for us?)
     use set calculus (interval analysis, Cartesian product, . . . )
→ Need for frameworks bridging these two





Possibility theory


Basic tool
A distribution π : S → [0, 1], usually with some si such that π(si ) = 1, from
which
     µ̄(A) = maxs∈A π(s) (possibility measure)
     µ(A) = 1 − µ̄(Ac ) = mins∈Ac (1 − π(s)) (necessity measure)
Sets E are captured by π(s) = 1 if s ∈ E, 0 otherwise

[µ, µ̄] as
     confidence degrees of possibility theory [9]
     bounds of an ill-known probability µ ⇒ µ ≤ µ ≤ µ̄ [10]
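Both measures are easy to compute on a finite space; a small sketch (the distribution values are illustrative):

```python
# Possibility and necessity measures induced by a possibility
# distribution pi over a finite space.
def possibility(pi, A):
    return max(pi[s] for s in A)

def necessity(pi, A):
    complement = set(pi) - set(A)
    # duality: N(A) = 1 - Pi(complement of A)
    return 1.0 - possibility(pi, complement) if complement else 1.0

pi = {"s1": 1.0, "s2": 0.7, "s3": 0.2}
A = {"s1", "s2"}
# possibility(pi, A) = 1.0 ; necessity(pi, A) = 1 - 0.2 = 0.8
```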





A nice characteristic: Alpha-cut [5]

Definition

                              Aα = {s ∈ S|π(s) ≥ α}
     µ(Aα ) = 1 − α
     If β ≤ α, then Aα ⊆ Aβ
Simulation: draw α uniformly in [0, 1] and associate the cut Aα

[figure: a distribution π with two nested cuts, Aα ⊆ Aβ for β ≤ α]

⇒ the possibilistic approach is ideal to model nested structures
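A toy sketch of this simulation on a discrete distribution (the values are illustrative):

```python
import random

# Alpha-cuts of a discrete possibility distribution: a higher alpha
# gives a smaller set, and cuts are nested.
def alpha_cut(pi, alpha):
    return {s for s, v in pi.items() if v >= alpha}

pi = {"s1": 1.0, "s2": 0.7, "s3": 0.2}
assert alpha_cut(pi, 0.8) <= alpha_cut(pi, 0.1)  # nestedness

random.seed(0)
alpha = random.random()          # draw alpha uniformly in [0, 1]
sampled_set = alpha_cut(pi, alpha)
```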



A basic distribution: simple support


 A set E of most plausible values
 A confidence degree α = µ(E)
 Two interesting cases:
      an expert providing the most plausible values E
      E the set of models of a formula φ
 Both cases extend to multiple nested sets E1 , . . . , Ep :
      confidence degrees over nested sets [36]
      hierarchical knowledge bases [33]
 Example: pH value ∈ [4.5, 5.5] with α = 0.5 (∼ "more probable than")
 [figure: π = 1 on [4.5, 5.5] and 0.5 outside, over the pH axis from 3 to 7]




A basic distribution: simple support

The same construction in a logical setting:
     variables p, q; Ω = {pq, ¬pq, p¬q, ¬p¬q}
     µ(p ⇒ q) = 0.9 (∼ "almost certain")
     E = set of models of p ⇒ q = {pq, ¬pq, ¬p¬q}
     π(pq) = π(¬pq) = π(¬p¬q) = 1, and π(p¬q) = 1 − 0.9 = 0.1 for the
     countermodel p¬q




Normalized likelihood as possibilities [8] [26]



                        π(θ) = L(θ|x)/maxθ∈Θ L(θ|x)

 Binomial situation:
       θ = success probability
       x = number of observed successes
       e.g. x = 4 successes out of 11, or x = 20 successes out of 55
 [figure: both normalized likelihoods peak at θ = 4/11; the larger sample
 gives a narrower distribution]
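A grid-based sketch of this construction for the binomial case (the grid size is an arbitrary choice):

```python
from math import comb

# Normalized binomial likelihood used as a possibility distribution:
# pi(theta) = L(theta|x) / max L(theta|x).
def normalized_likelihood(k, n, grid=1001):
    thetas = [i / (grid - 1) for i in range(grid)]
    lik = [comb(n, k) * t**k * (1 - t)**(n - k) for t in thetas]
    m = max(lik)
    return thetas, [v / m for v in lik]

thetas, pi = normalized_likelihood(4, 11)
peak = thetas[pi.index(max(pi))]     # close to the MLE 4/11
```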






Partially specified probabilities [25] [32]



 Triangular distribution: [µ, µ̄]
 encompasses all probabilities with
       mode/reference value M
       support domain [a, b]

 Getting back to pH:
       M=5
       [a, b] = [3, 7]
 [figure: triangular possibility distribution π peaking at pH = 5, with support [3, 7]]





Other examples




    Statistical inequalities (e.g., Chebyshev inequality) [32]
    Linguistic information (fuzzy sets) [28]
    Approaches based on nested models






Possibility: limitations



                                      µ(A) > 0 ⇒ µ̄(A) = 1
                                      µ̄(A) < 1 ⇒ µ(A) = 0


 ⇒ the interval [µ(A), µ̄(A)] always has one trivial bound
Does not include probabilities as a special case:
 ⇒ possibility and probability are at odds
 ⇒ their respective calculi are hard (sometimes impossible?) to reconcile






Going beyond



Extend the theory
 ⇒ by complementing π with a lower distribution δ (δ ≤ π ) [11], [31]
 ⇒ by working with interval-valued possibility/necessity degrees [4]
 ⇒ by working with sets of possibility measures [7]

Use a more general model
⇒ Random sets and belief functions






Random sets and belief functions
Basic tool
A positive distribution m : 2^S → [0, 1], with ΣE m(E) = 1 and usually
m(∅) = 0, from which
      µ̄(A) = ΣE∩A≠∅ m(E) (plausibility measure)
      µ(A) = ΣE⊆A m(E) = 1 − µ̄(Ac ) (belief measure)

[figure: five focal sets E1 , . . . , E5 and an event A, with
µ(A) = m(E1 ) + m(E2 ) and µ̄(A) = m(E1 ) + m(E2 ) + m(E3 ) + m(E5 )]

[µ, µ̄] as
      confidence degrees of evidence theory [16], [17]
      bounds of an ill-known probability µ ⇒ µ ≤ µ ≤ µ̄ [14]


Special cases

Measures [µ, µ̄] include:
     Probability distributions: mass on atoms/singletons
     Possibility distributions: mass on nested sets

[figure: nested focal sets E1 ⊆ E2 ⊆ E3 ⊆ E4 , the possibilistic case]


Frequencies of imprecise observations


Imprecise poll: "Who will win the next Wimbledon tournament?"
 N(adal)           F(ederer)               D(jokovic)             M(urray)   O(ther)

60 % replied {N, F , D} → m({N, F , D}) = 0.6
15 % replied "I do not know" → m({N, F , D, M, O}) = m(S) = 0.15
10 % replied Murray → m({M}) = 0.1
5 % replied others → m({O}) = 0.05
...
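From such masses, the belief and plausibility of any event follow directly; a sketch (the elided remaining 10 % is completed here with a purely hypothetical m({N}) = 0.1):

```python
# Belief and plausibility of an event A from a mass function m
# (focal sets represented as frozensets of candidate initials).
def belief(m, A):
    return sum(w for E, w in m.items() if E <= A)

def plausibility(m, A):
    return sum(w for E, w in m.items() if E & A)

m = {frozenset("NFD"): 0.6,
     frozenset("NFDMO"): 0.15,
     frozenset("M"): 0.1,
     frozenset("O"): 0.05,
     frozenset("N"): 0.1}      # hypothetical completion of the poll
A = frozenset("N")             # "Nadal wins"
# belief(m, A) = 0.1 ; plausibility(m, A) = 0.6 + 0.15 + 0.1 = 0.85
```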






P-box [34]

                                                           Expert providing percentiles
 A pair [F , F̄ ] of cumulative
 distributions
 Bounds over events [−∞, x]:
       percentiles given by experts;
       Kolmogorov-Smirnov bounds;
 Can be extended to any
 pre-ordered space [30], [37] ⇒
 multivariate spaces!

 Example: expert providing percentiles
       0 ≤ P([−∞, 12]) ≤ 0.2
       0.2 ≤ P([−∞, 24]) ≤ 0.4
       0.6 ≤ P([−∞, 36]) ≤ 0.8
 [figure: the induced p-box and its focal sets E1 , . . . , E5 on the axis from 6 to 42]
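A minimal sketch of checking a candidate CDF against these percentile bounds (the uniform candidate is purely illustrative):

```python
# Expert percentile bounds as a discrete p-box: for each x, the
# admissible interval for P([-inf, x]).
bounds = {12: (0.0, 0.2), 24: (0.2, 0.4), 36: (0.6, 0.8)}

def admissible(F):
    """Does the CDF F satisfy every percentile constraint?"""
    return all(lo <= F(x) <= hi for x, (lo, hi) in bounds.items())

uniform = lambda x: (x - 6) / 36     # uniform CDF on [6, 42]
# the uniform CDF fails: F(24) = 0.5 exceeds the bound 0.4
```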






Other means to get random sets/belief functions




    Extending modal logic: probability of provability [18]
    Parameter estimation using pivotal quantities [15]
    Statistical confidence regions [29]
    Modify source information by its reliability [35]
    ...






Limits of random sets


    No fully satisfactory extension of the Bayesian/subjective approach yet
    Still some items of information it cannot model in a simple way,
    e.g.,
            probabilistic bounds over atoms si (imprecise histograms, . . . ) [27];
            comparative assessments such as 2P(B) ≤ P(A)






Imprecise probabilities


Basic tool
A set P of probabilities on S, or an equivalent representation, from which
     µ̄(A) = supP∈P P(A) (upper probability)
     µ(A) = infP∈P P(A) = 1 − µ̄(Ac ) (lower probability)


[µ, µ̄] as
     subjective lower and upper betting rates [23]
     bounds of an ill-known probability measure
     µ ⇒ µ ≤ µ ≤ µ̄ [19] [24]
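When the credal set is described by finitely many extreme points, the bounds become a min/max over them; a sketch reusing the earlier partially specified example (the two listed distributions are only an illustrative subset of the extreme points):

```python
# Lower/upper probability of an event as inf/sup over a finite set
# of extreme distributions of a credal set.
vertices = [
    {"s1": 0.1, "s2": 0.4, "s3": 0.5, "s4": 0.0},
    {"s1": 0.1, "s2": 0.4, "s3": 0.0, "s4": 0.5},
]

def prob(p, A):
    return sum(p[s] for s in A)

def lower(P, A):
    return min(prob(p, A) for p in P)

def upper(P, A):
    return max(prob(p, A) for p in P)

A = {"s3"}
# lower(vertices, A) = 0.0 ; upper(vertices, A) = 0.5
```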






Means to get Imprecise probabilistic models


    Include all representations seen so far . . .
    . . . and a couple of others
            probabilistic comparisons
            density ratio-classes
            expectation bounds
            ...
    fully coherent extension of the Bayesian approach

                                  P(θ|x) ∝ L(θ|x)P(θ)

    → often easy for "conjugate priors" [22]
    make probabilistic logic approaches imprecise [21, 20]





A crude summary
Possibility distributions
     +: very simple, natural in many situations (nestedness), extend
     set-based approach
     -: at odds with probability theory, limited expressiveness

Random sets
   +: include probabilities and possibilities, include many models
   used in practice
     -: general models can be intractable, limited expressiveness

Imprecise probabilities
     +: most consistent extension of probabilistic approach, very
     flexible
     -: general models can be intractable


A not completely accurate but useful picture

[figure: the theories placed along two axes, expressivity/flexibility and
general tractability (scalability): sets and probability at the bottom,
possibility and random sets above them, imprecise probability on top;
probability is able to model variability, sets are incompleteness tolerant]
How to draw conclusions from information and decide?


Plan



1    Introductory elements


2    How to represent uncertainty?


3    How to draw conclusions from information and decide?


4    Some final comments






Section goals




    Introduce the inference problem
    Introduce the notion of joint models
    Introduce how (basic) decision can be done
    Give some basic illustrations, mainly from
    regression/classification/reliability






The problem




    Uncertain input: marginal pieces of information on parts of the
    data space and the model
    Step 1: build a joint model from the marginal information
    Step 2: deduce information (by propagation, conditioning, . . . ) on the
    data




Closeness requirement




     the partial/marginal pieces of information are of type x
     the joint model is of type x
     the deduced information is of type x
where x ∈ {prob. distribution, poss. distribution, belief function, prob.
set}




Straight ahead

                                        y = ax + b

              No uncertainty: a = 2.5, b = 3.5 → joint: {2.5} × {3.5}
                    Infer y when x = 3: y = 2.5 × 3 + 3.5 = 11
[figure: the line y = 2.5x + 3.5 over x ∈ [1, 4], reaching y = 11 at x = 3]




Straight ahead

                                        y = ax + b

            Imprecision: a ∈ [2, 3], b ∈ [3, 4] → joint: [2, 3] × [3, 4]
            Infer y when x = 3: y ∈ [2 × 3 + 3, 3 × 3 + 4] = [9, 13]
[figure: the bundle of lines y = ax + b for (a, b) ∈ [2, 3] × [3, 4], giving
the interval [9, 13] at x = 3]
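Since y is increasing in both a and b when x ≥ 0, the interval result comes from mapping endpoints to endpoints; a sketch:

```python
# Interval propagation through y = a*x + b for nonnegative x:
# y increases in both a and b, so endpoints map to endpoints.
def interval_affine(a, b, x):
    """a, b: (lo, hi) intervals; x: scalar with x >= 0."""
    return (a[0] * x + b[0], a[1] * x + b[1])

y = interval_affine((2, 3), (3, 4), 3)
# y == (9, 13)
```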




Joint models: possibilistic illustration

[figure: possibility distributions π on a and on b, and the induced joint
possibility distribution over (a, b)]


Fuzzy straight ahead

                                        y = ax + b

                    Possibilistic uncertainty on a and b
                          Infer y when x = 3
[figure: the bundle of lines compatible with the possibility distributions
on a and b, and the induced possibility distribution on y at x = 3]

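The propagation can be sketched cut by cut, combining the alpha-cut idea with interval analysis (triangular distributions whose cores/supports match the earlier precise and interval cases are an assumption for illustration):

```python
# Propagate possibilistic uncertainty on a and b through y = a*x + b
# by interval analysis on each alpha-cut.
def tri_cut(core, lo, hi, alpha):
    """Alpha-cut of a triangular possibility distribution."""
    return (lo + alpha * (core - lo), hi - alpha * (hi - core))

def propagate(x, alpha):
    a_lo, a_hi = tri_cut(2.5, 2.0, 3.0, alpha)   # a: core 2.5, support [2, 3]
    b_lo, b_hi = tri_cut(3.5, 3.0, 4.0, alpha)   # b: core 3.5, support [3, 4]
    return (a_lo * x + b_lo, a_hi * x + b_hi)    # valid for x >= 0

# alpha = 0 recovers the interval result [9, 13];
# alpha = 1 recovers the precise result y = 11
```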



Reliable or not?

                 Model: structure function φ : C1 × C2 → S
                 (here the system fails only if both components fail)
     C1 : {0, 1} with p(0) = 0.1, p(1) = 0.9
     C2 : {0, 1} with p(0) = 0.1, p(1) = 0.9
     Joint (independence): p(0 × 0) = 0.01, p(0 × 1) = 0.09,
     p(1 × 0) = 0.09, p(1 × 1) = 0.81
     Output S : {0, 1} with p(0) = 0.01, p(1) = 0.99



Reliable or not?

                 Model: structure function φ : C1 × C2 → S
     C1 , C2 : {0, 1}, each with m({0}) = 0.05, m({1}) = 0.75, m({0, 1}) = 0.2
     Joint (random-set independence, product of masses):
          m({0} × {0}) = 0.0025
          m({1} × {1}) = 0.5625
          m({0} × {0, 1}) = 0.01
          m({0, 1} × {0, 1}) = 0.04
                     ...
     Output S : {0, 1}:
          m({0}) = 0.0025
          m({1}) = 0.9375
          m({0, 1}) = 0.06


Two kinds of decision

     Binary: whether to take an action or not
             risk/reliability analysis (take the risk or not)
             logic (decide whether a sentence is true)
             binary classification
     Non-binary: decide among multiple choices
             classification
             control, planning, . . .
Introducing imprecision ⇒ allowing for incomparability






Binary case
                 A threshold τ and two decisions {1, −1}: decide 1 if the
                 confidence degree in a exceeds τ , and −1 otherwise
Precise case: µ(a) lies below τ

    Decision: -1


Imprecise but conclusive case: both µ(a) and µ̄(a) lie below τ

    Decision: -1


Imprecise and inconclusive case: µ(a) < τ < µ̄(a)

    No winner: {-1, 1}
    Maximin (pessimistic, uses µ): -1
    Maximax (optimistic, uses µ̄): 1
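The three rules can be written directly from the interval [µ(a), µ̄(a)] (the numeric values below are illustrative):

```python
# Binary decision from an interval-valued confidence degree and a
# threshold tau.
def decide(mu_low, mu_up, tau):
    if mu_up < tau:
        return {-1}
    if mu_low > tau:
        return {1}
    return {-1, 1}                    # inconclusive: no winner

def maximin(mu_low, mu_up, tau):
    return 1 if mu_low > tau else -1  # pessimistic: trust the lower bound

def maximax(mu_low, mu_up, tau):
    return 1 if mu_up > tau else -1   # optimistic: trust the upper bound

# with mu_low = 0.3, mu_up = 0.8, tau = 0.5:
# decide -> {-1, 1}, maximin -> -1, maximax -> 1
```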


Multiple choice case

Each option yi receives an interval-valued evaluation [µ(yi ), µ̄(yi )]
[figure: intervals on a scale from 1 to 5 for the options y1 , . . . , y5 ]


Multiple choice case

[Figure: same intervals, with arrows marking the dominated
alternatives y2 and y4 ]

Dominance: {y1 , y3 , y5 }


 Sébastien Destercke (CNRS)                         SUM tutorial                                 SUM 2012   45 / 56
How to draw conclusions from information and decide?


Multiple choice case

[Figure: same intervals]

Maximin: {y3 }


 Sébastien Destercke (CNRS)                         SUM tutorial        SUM 2012   45 / 56
How to draw conclusions from information and decide?


Multiple choice case

[Figure: same intervals]

Maximax: {y1 }


 Sébastien Destercke (CNRS)                         SUM tutorial        SUM 2012   45 / 56
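The three rules of the multiple-choice case can be sketched in a few lines. Interval dominance discards an alternative as soon as some other alternative surely scores higher; maximin and maximax keep the best lower and best upper bound. The interval values below are illustrative, chosen by us to reproduce the slide's results.

```python
def decide_multi(intervals):
    """Decision rules for alternatives with interval-valued evaluations.
    `intervals` maps each alternative to its (lower, upper) evaluation."""
    names = list(intervals)
    # interval dominance: drop y if some z has a lower bound above y's upper bound
    undominated = [y for y in names
                   if not any(intervals[z][0] > intervals[y][1]
                              for z in names if z != y)]
    best_lo = max(lo for lo, _ in intervals.values())
    best_hi = max(hi for _, hi in intervals.values())
    maximin = [y for y in names if intervals[y][0] == best_lo]
    maximax = [y for y in names if intervals[y][1] == best_hi]
    return undominated, maximin, maximax

iv = {"y1": (1, 5), "y2": (1, 2), "y3": (3, 4), "y4": (1, 2.5), "y5": (2, 3.5)}
decide_multi(iv)
# → (['y1', 'y3', 'y5'], ['y3'], ['y1'])
```

Note that dominance returns a *set* of alternatives (it tolerates incomparability), while maximin and maximax force a single choice except in case of ties.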
Some final comments


Plan



1    Introductory elements


2    How to represent uncertainty?


3    How to draw conclusions from information and decide?


4    Some final comments




    Sébastien Destercke (CNRS)                 SUM tutorial   SUM 2012   46 / 56
Some final comments


Why model uncertainty (beyond intellectual satisfaction)?



Because . . .
     . . . you should (risk/reliability analysis)
     . . . it solves existing issues (non-monotonic reasoning)
     . . . it gives better/more robust results with acceptable
     computational burden




  Sébastien Destercke (CNRS)                 SUM tutorial        SUM 2012   47 / 56
Some final comments


Scalability



Adding flexibility to the model → increased scalability issues
     already true for probability and intervals
     only gets worse as the model becomes more complex
How to solve it? As in other domains:
     approximation, model reduction, . . . → make things as simple as
     possible (but not simpler) to answer your question
     sampling
     use flexibility only where you need it




  Sébastien Destercke (CNRS)                 SUM tutorial      SUM 2012   48 / 56
Some final comments


One advantage of incompleteness


Using approximations: choice between an outer and an inner approximation

[Figure: a region x1 together with an outer and an inner approximation,
labelled x2 and x3 ]
 Sébastien Destercke (CNRS)                  SUM tutorial        SUM 2012   49 / 56
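As a toy illustration of the outer/inner choice (the functions and data are ours, not from the slides), consider approximating a union of intervals by a single interval: the hull is a guaranteed outer approximation (it never excludes a possible value), while any member interval is an inner approximation (it never includes an impossible one).

```python
def hull_outer(intervals):
    """Outer approximation: the smallest single interval containing
    every interval in the collection."""
    return (min(a for a, _ in intervals), max(b for _, b in intervals))

def widest_inner(intervals):
    """A simple inner approximation: the widest member interval."""
    return max(intervals, key=lambda ab: ab[1] - ab[0])

iv = [(0, 1), (2, 5), (4, 6)]
hull_outer(iv)    # → (0, 6)
widest_inner(iv)  # → (2, 5)
```

Which side to approximate from depends on the question: conservative conclusions (safety) call for outer approximations, guaranteed-to-hold conclusions for inner ones.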
Some final comments


References I
General bibliography
[1] B. de Finetti.
    Theory of probability, volume 1-2.
    Wiley, NY, 1974.
    Translation of 1970 book.
[2] L. Jaulin, M. Kieffer, O. Didrit, and E. Walter.
    Applied Interval Analysis.
    Springer, London, 2001.
[3] R. von Mises.
    Probability, Statistics and Truth.
    Dover books explaining science. Dover Publications, 1981.
Possibility theory
[4]    Salem Benferhat, Julien Hué, Sylvain Lagrue, and Julien Rossit.
       Interval-based possibilistic logic.
       In IJCAI, pages 750–755, 2011.
[5]    I. Couso, S. Montes, and P. Gil.
       The necessity of the strong alpha-cuts of a fuzzy set.
       Int. J. on Uncertainty, Fuzziness and Knowledge-Based Systems, 9:249–262, 2001.
[6]    G. de Cooman and D. Aeyels.
       Supremum-preserving upper probabilities.
       Information Sciences, 118:173–212, 1999.
[7]    D. Dubois.
       Fuzzy measures on finite scales as families of possibility measures.
       In Proc. European Society for Fuzzy Logic and Technology conference, 2011.



      Sébastien Destercke (CNRS)                          SUM tutorial                   SUM 2012   50 / 56
Some final comments


References II
[8]    D. Dubois, S. Moral, and H. Prade.
       A semantics for possibility theory based on likelihoods,.
       Journal of Mathematical Analysis and Applications, 205(2):359 – 380, 1997.
[9]    D. Dubois and H. Prade.
       Possibility Theory: An Approach to Computerized Processing of Uncertainty.
       Plenum Press, New York, 1988.
[10] D. Dubois and H. Prade.
     When upper probabilities are possibility measures.
     Fuzzy Sets and Systems, 49:65–74, 1992.
[11] Didier Dubois and Henri Prade.
     An overview of the asymmetric bipolar representation of positive and negative information in possibility theory.
     Fuzzy Sets and Systems, 160(10):1355–1366, 2009.
[12] G. L. S. Shackle.
     Decision, Order and Time in Human Affairs.
     Cambridge University Press, Cambridge, 1961.
[13] L.A. Zadeh.
     Fuzzy sets as a basis for a theory of possibility.
     Fuzzy Sets and Systems, 1:3–28, 1978.
Random sets and belief functions
[14] A.P. Dempster.
     Upper and lower probabilities induced by a multivalued mapping.
     Annals of Mathematical Statistics, 38:325–339, 1967.
[15] Ryan Martin, Jianchun Zhang, and Chuanhai Liu.
      Dempster-Shafer theory and statistical inference with weak beliefs.
     Technical report, 2008.


      Sébastien Destercke (CNRS)                           SUM tutorial                                      SUM 2012   51 / 56
Some final comments


References III

[16] G. Shafer.
     A mathematical Theory of Evidence.
     Princeton University Press, New Jersey, 1976.
[17] P. Smets.
      The transferable belief model and other interpretations of Dempster-Shafer’s model.
      In Proc. of the Sixth Annual Conference on Uncertainty in Artificial Intelligence, pages 375–384, 1990.
[18] Philippe Smets.
     Probability of provability and belief functions.
     Logique et Analyse, 133-134:177–195, 1991.
Imprecise probability
[19] J. O. Berger.
     An overview of robust Bayesian analysis.
     Test, 3:5–124, 1994.
     With discussion.
[20] Gert de Cooman.
     Belief models: An order-theoretic investigation.
     Ann. Math. Artif. Intell., 45(1-2):5–34, 2005.
[21] Pierre Hansen, Brigitte Jaumard, Marcus Poggi de Aragão, Fabien Chauny, and Sylvain Perron.
     Probabilistic satisfiability with imprecise probabilities.
     Int. J. Approx. Reasoning, 24(2-3):171–189, 2000.
[22] Erik Quaeghebeur and Gert de Cooman.
     Imprecise probability models for inference in exponential families.
     In ISIPTA, pages 287–296, 2005.




   Sébastien Destercke (CNRS)                              SUM tutorial                                     SUM 2012   52 / 56
Some final comments


References IV
[23] P. Walley.
     Statistical reasoning with imprecise Probabilities.
     Chapman and Hall, New York, 1991.
[24] Kurt Weichselberger.
     The theory of interval-probability as a unifying concept for uncertainty.
     International Journal of Approximate Reasoning, 24(2–3):149 – 170, 2000.
Practical representations
[25] C. Baudrit and D. Dubois.
     Practical representations of incomplete probabilistic knowledge.
     Computational Statistics and Data Analysis, 51(1):86–108, 2006.
[26] M. Cattaneo.
     Likelihood-based statistical decisions.
     In Proc. 4th International Symposium on Imprecise Probabilities and Their Applications, pages 107–116, 2005.
[27] L.M. de Campos, J.F. Huete, and S. Moral.
     Probability intervals: a tool for uncertain reasoning.
     I. J. of Uncertainty, Fuzziness and Knowledge-Based Systems, 2:167–196, 1994.
[28] G. de Cooman and P. Walley.
     A possibilistic hierarchical model for behaviour under uncertainty.
     Theory and Decision, 52:327–374, 2002.
[29] T. Denoeux.
     Constructing belief functions from sample data using multinomial confidence regions.
     I. J. of Approximate Reasoning, 42:228–252, 2006.
[30] S. Destercke, D. Dubois, and E. Chojnacki.
     Unifying practical uncertainty representations: I generalized p-boxes.
     Int. J. of Approximate Reasoning, 49:649–663, 2008.


   Sébastien Destercke (CNRS)                              SUM tutorial                                  SUM 2012   53 / 56
Some final comments


References V
[31] S. Destercke, D. Dubois, and E. Chojnacki.
     Unifying practical uncertainty representations: II clouds.
      Int. J. of Approximate Reasoning, 49:664–677, 2008.
[32] D. Dubois, L. Foulloy, G. Mauris, and H. Prade.
     Probability-possibility transformations, triangular fuzzy sets, and probabilistic inequalities.
     Reliable Computing, 10:273–297, 2004.
[33] Didier Dubois and Henri Prade.
     Possibilistic logic: a retrospective and prospective view.
     Fuzzy Sets and Systems, 144(1):3 – 23, 2004.
[34] S. Ferson, L. Ginzburg, V. Kreinovich, D.M. Myers, and K. Sentz.
      Constructing probability boxes and Dempster-Shafer structures.
     Technical report, Sandia National Laboratories, 2003.
[35] Frédéric Pichon, Didier Dubois, and Thierry Denoeux.
     Relevance and truthfulness in information correction and fusion.
     Int. J. Approx. Reasoning, 53(2):159–175, 2012.
[36] S.A. Sandri, D. Dubois, and H.W. Kalfsbeek.
     Elicitation, assessment and pooling of expert judgments using possibility theory.
     IEEE Trans. on Fuzzy Systems, 3(3):313–335, August 1995.
[37] Matthias C. M. Troffaes and Sébastien Destercke.
     Probability boxes on totally preordered spaces for multivariate modelling.
     Int. J. Approx. Reasoning, 52(6):767–791, 2011.
Independence notions
[38] B. Bouchon-Meunier, G. Coletti, and C. Marsala.
     Independence and possibilistic conditioning.
     Annals of Mathematics and Artificial Intelligence, 35:107–123, 2002.


   Sébastien Destercke (CNRS)                                SUM tutorial                              SUM 2012   54 / 56
Some final comments


References VI

[39] I. Couso and S. Moral.
     Independence concepts in evidence theory.
     International Journal of Approximate Reasoning, 51:748–758, 2010.
[40] I. Couso, S. Moral, and P. Walley.
     A survey of concepts of independence for imprecise probabilities.
     Risk Decision and Policy, 5:165–181, 2000.
[41] L. de Campos and J. Huete.
      Independence concepts in possibility theory: Part II.
     Fuzzy Sets and Systems, 103:487–505, 1999.
[42] G. de Cooman.
     Possibility theory III: possibilistic independence.
     International Journal of General Systems, 25:353–371, 1997.
[43] G. de Cooman and E. Miranda.
     Independent natural extension for sets of desirable gambles.
     In F. Coolen, G. de Cooman, T. Fetz, and M. Oberguggenberger, editors, ISIPTA ’11: Proceedings of the Seventh
     International Symposium on Imprecise Probabilities: Theories and Applications, pages 169–178, Innsbruck, 2011. Action M
     Agency for SIPTA.
[44] G. de Cooman, E. Miranda, and M. Zaffalon.
     Independent natural extension.
     Artificial Intelligence, 174:1911–1950, 2011.
[45] D. Dubois, L. Farinas del Cerro, A. Herzig, and H. Prade.
     A roadmap of qualitative independence.
     In D. Dubois, H. Prade, and E.P. Klement, editors, Fuzzy sets, logics, and reasoning about knowledge. Springer, 1999.




   Sébastien Destercke (CNRS)                                SUM tutorial                                 SUM 2012           55 / 56
Some final comments


References VII

[46] B. Ben Yaghlane, P. Smets, and K. Mellouli.
      Belief function independence: I. the marginal case.
      I. J. of Approximate Reasoning, 29(1):47–70, 2002.
Inference
[47] C. Baudrit, I. Couso, and D. Dubois.
     Joint propagation of probability and possibility in risk analysis: towards a formal framework.
     Int. J. of Approximate Reasoning, 45:82–105, 2007.
[48] T. Fetz and M. Oberguggenberger.
     Propagation of uncertainty through multivariate functions in the framework of sets of probability measures.
     Reliability Engineering and System Safety, 85:73–87, 2004.
[49] Pierre Hansen, Brigitte Jaumard, Marcus Poggi de Aragão, Fabien Chauny, and Sylvain Perron.
     Probabilistic satisfiability with imprecise probabilities.
     Int. J. Approx. Reasoning, 24(2-3):171–189, 2000.
Decision
[50] Didier Dubois, Hélène Fargier, and Patrice Perny.
     Qualitative decision theory with preference relations and comparative uncertainty: An axiomatic approach.
     Artif. Intell., 148(1-2):219–260, 2003.
[51] P. Smets.
      Decision making in the TBM: the necessity of the pignistic transformation.
     I.J. of Approximate Reasoning, 38:133–147, 2005.
[52] M.C.M. Troffaes.
     Decision making under uncertainty using imprecise probabilities.
     Int. J. of Approximate Reasoning, 45:17–29, 2007.




   Sébastien Destercke (CNRS)                               SUM tutorial                                    SUM 2012   56 / 56

Tutorial SUM 2012: some of the things you wanted to know about uncertainty (but were too busy to ask)

  • 1. Some of the things you wanted to know about uncertainty (and were too busy to ask) Sébastien Destercke Heudiasyc, CNRS Compiegne, France SUM 2012 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 1 / 56
  • 2. Me, myself and I 2005-2008: PhD student Main topic: uncertainty modelling and treatment in nuclear safety 2009-2011: research engineer in agronomical research institute Main topic: aid-decision in agronomical production chains 2011- ?: researcher at Centre National de la Recherche Scientifique (CNRS), in the Heudiasyc joint unit research Main (current) topics: reliability analysis and machine learning Only common point: the modelling and handling of uncertainty (including an imprecision component) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 2 / 56
  • 3. Tutorial goals and contents What you will find in this tutorial Mostly practical considerations about uncertainty An overview of "mainstream" uncertainty theories Elements and illustrations of their use to build or learn uncertainty representations make inference (and decision) A "personal" view about those things What you will not find in this tutorial A deep and exhaustive study of a particular topic Elements about other important problems (learning models, information fusion/revision) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 3 / 56
  • 4. Introductory elements Plan 1 Introductory elements 2 How to represent uncertainty? 3 How to draw conclusions from information and decide? 4 Some final comments Sébastien Destercke (CNRS) SUM tutorial SUM 2012 4 / 56
  • 5. Introductory elements Section goals: it’s all about basics Introduce a basic framework Give basic ideas about uncertainty Introduce some basic problems Sébastien Destercke (CNRS) SUM tutorial SUM 2012 5 / 56
  • 6. Introductory elements A generic framework data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information model: describes a relation in the data space singular information: concerns a particular situation/individual generic information: describes a general relationship, the behaviour of a population, . . . Sébastien Destercke (CNRS) SUM tutorial SUM 2012 6 / 56
  • 7. Introductory elements Uncertainty origins Uncertainty: inability to answer precisely a question about a quantity Can concern both: Singular information items in a data-base, values of some logical variables, time before failure of a component Generic information parameter values of classifiers/regression models, time before failure of components, truth of a logical sentence ("birds fly") Main origins Variability of a population → only concerns generic information Imprecision due to a lack of information Conflict between different sources of information (data/expert) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 7 / 56
  • 8. Introductory elements Some examples data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Classification Data space=input features X × (structured) classes Y model: classifier with parameters Uncertainty: mostly about model parameters Common problem: predict classes of individuals (singular information) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 8 / 56
  • 9. Introductory elements Some examples data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Risk and reliability analysis Data space=input variables X × output variable(s) Y Model: transfer/structure function f : X → Y Uncertainty: very often about X (sometimes f parameters) Common problem: obtain information about Y, either generic (failure of products) or singular (nuclear power plant) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 8 / 56
  • 10. Introductory elements Some examples data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Data mining/clustering Data space=data features Model: clusters, rules, . . . Uncertainty: mostly about model parameters Common problem: obtain the model from data {ω1 , . . . , ωn } Sébastien Destercke (CNRS) SUM tutorial SUM 2012 8 / 56
  • 11. Introductory elements Some examples data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Data base querying Data space=data features Model: a query inducing preferences over observations Uncertainty: mostly about the query, sometimes data Common problem: retrieve and order interesting items in {ω1 , . . . , ωn } Sébastien Destercke (CNRS) SUM tutorial SUM 2012 8 / 56
  • 12. Introductory elements Some examples data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Propositional logic Data space=set of possible interpretations Model: set of sentences of the language Uncertainty: on sentences or on the state of some atoms Common problem: deduce the uncertainty about the truth of a sentence S from facts and knowledge Sébastien Destercke (CNRS) SUM tutorial SUM 2012 8 / 56
  • 13. Introductory elements Handling uncertainty data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information Common problems in one sentence Learning: use singular information to estimate generic information Inference: interrogate model and observations to deduce information on quantity of interest Information fusion: merge multiple information pieces about same quantity Information revision: merge new information with old one Sébastien Destercke (CNRS) SUM tutorial SUM 2012 9 / 56
  • 14. How to represent uncertainty? Plan 1 Introductory elements 2 How to represent uncertainty? 3 How to draw conclusions from information and decide? 4 Some final comments Sébastien Destercke (CNRS) SUM tutorial SUM 2012 10 / 56
  • 15. How to represent uncertainty? Section goals Introduce main ideas of theories Provide elements about links between them Illustrate how to get uncertainty representations within each Sébastien Destercke (CNRS) SUM tutorial SUM 2012 11 / 56
  • 16. How to represent uncertainty? Basic framework Quantity S with possible exclusive states S = {s1 , . . . , sn } S: data feature, model parameter, . . . Basic tools A confidence degree µ : 2^S → [0, 1] is such that µ(A): confidence that S ∈ A µ(∅) = 0, µ(S) = 1 A ⊆ B ⇒ µ(A) ≤ µ(B) Uncertainty modelled by 2 degrees µ, µ̄ : 2^S → [0, 1]: µ(A) ≤ µ̄(A) (monotonicity) µ(A) = 1 − µ̄(A^c) (duality) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 12 / 56
  • 17. How to represent uncertainty? Probability Basic tool A probability distribution p : S → [0, 1] from which µ(A) = µ̄(A) = Σ_{s∈A} p(s) µ(A) = 1 − µ(A^c): auto-dual Main interpretations Frequentist [3]: µ(A) = proportion of times A is observed in a population only applies when THERE IS a population Subjectivist [1]: µ(A) = price for a gamble giving 1 if A happens, 0 if not applies to singular situations and populations Sébastien Destercke (CNRS) SUM tutorial SUM 2012 13 / 56
  • 18. How to represent uncertainty? Probability and imprecision: short comment Probability often partially specified over S Probability on rest of S usually imprecise A small example S = {s1 , s2 , s3 , s4 } p(s1 ) = 0.1, p(s2 ) = 0.4 we deduce p(si ) ∈ [0, 0.5] for i = 3, 4 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 14 / 56
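The deduction on this slide can be reproduced with a small sketch (the function name is mine): once some atoms have exact probabilities, each unspecified atom is bounded by the unallocated mass.

```python
def atom_bounds(known, atoms):
    """Bound the probability of each unspecified atom.

    The atoms in `known` have exact probabilities; the remaining mass
    1 - sum(known) must be shared among the other atoms, so each of
    them lies in [0, remaining].
    """
    remaining = 1.0 - sum(known.values())
    return {s: (0.0, remaining) for s in atoms if s not in known}

bounds = atom_bounds({"s1": 0.1, "s2": 0.4}, ["s1", "s2", "s3", "s4"])
# p(s3) and p(s4) each lie in [0, 0.5], as on the slide
```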
  • 19. How to represent uncertainty? Probability and imprecision: short comment Probability often partially specified over S Probability on rest of S usually imprecise Another (logical) example q, r two propositional variables P(¬q ∨ r ) = α, P(q) = β we deduce P(r ) ∈ [β − 1 + α, α] Sébastien Destercke (CNRS) SUM tutorial SUM 2012 14 / 56
  • 20. How to represent uncertainty? Sets Basic tool A set E ⊆ S with true value S ∈ E from which E ⊆ A → µ(A) = µ̄(A) = 1 (truth certainly in A) E ∩ A ≠ ∅, E ∩ A^c ≠ ∅ → µ(A) = 0, µ̄(A) = 1 (ignorance) E ∩ A = ∅ → µ(A) = µ̄(A) = 0 (truth cannot be in A) µ, µ̄ are binary → limited expressiveness Classical use of sets: Interval analysis [2] (E is a subset of R) Propositional logic (E is the set of models of a KB) Other cases: robust optimisation, decision under risk, . . . Sébastien Destercke (CNRS) SUM tutorial SUM 2012 15 / 56
  • 21. How to represent uncertainty? In summary Probabilities . . . (+) very informative quantification (do we need it?) (-) need lots of information (do we have it?) (-) if there is not enough, require a choice (do we want to make it?) use probabilistic calculus (convolution, stoch. independence, . . . ) Sets . . . (+) need very little information (-) very rough quantification of uncertainty (is it sufficient for us?) use set calculus (interval analysis, Cartesian product, . . . ) → Need for frameworks bridging these two Sébastien Destercke (CNRS) SUM tutorial SUM 2012 16 / 56
  • 22. How to represent uncertainty? Possibility theory Basic tool A distribution π : S → [0, 1], usually with some si such that π(si ) = 1, from which µ̄(A) = max_{s∈A} π(s) (Possibility measure) µ(A) = 1 − µ̄(A^c) = min_{s∈A^c} (1 − π(s)) (Necessity measure) Sets E captured by π(s) = 1 if s ∈ E, 0 otherwise [µ, µ̄] as confidence degrees of possibility theory [9] bounds of an ill-known probability µ ⇒ µ ≤ µ ≤ µ̄ [10] Sébastien Destercke (CNRS) SUM tutorial SUM 2012 17 / 56
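A minimal sketch of these two measures (the distribution and event below are illustrative):

```python
def possibility(pi, event):
    # Pos(A) = max of pi over A
    return max(pi[s] for s in event)

def necessity(pi, event):
    # Nec(A) = 1 - Pos(complement of A)
    complement = set(pi) - set(event)
    return 1 - max((pi[s] for s in complement), default=0.0)

pi = {"s1": 1.0, "s2": 0.7, "s3": 0.2}   # normalized: pi reaches 1
A = {"s1", "s2"}
# Pos(A) = 1.0 and Nec(A) = 1 - pi(s3) = 0.8
```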
  • 23. How to represent uncertainty? A nice characteristic: Alpha-cut [5] Definition Aα = {s ∈ S | π(s) ≥ α} µ(Aα ) = 1 − α If β ≤ α, Aα ⊆ Aβ Simulation: draw α ∈ [0, 1] and associate Aα (figure: a possibility distribution π with nested cuts Aα ⊆ Aβ for β ≤ α) ⇒ Possibilistic approach ideal to model nested structures Sébastien Destercke (CNRS) SUM tutorial SUM 2012 18 / 56
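The alpha-cut definition and the simulation step can be sketched as follows (distribution values are illustrative):

```python
import random

def alpha_cut(pi, alpha):
    # A_alpha = {s | pi(s) >= alpha}
    return {s for s, v in pi.items() if v >= alpha}

pi = {"s1": 1.0, "s2": 0.7, "s3": 0.2}

# nestedness: beta <= alpha implies A_alpha included in A_beta
assert alpha_cut(pi, 0.8) <= alpha_cut(pi, 0.5)

# simulation: draw alpha uniformly in [0, 1] and return the associated cut
random.seed(0)
sampled_set = alpha_cut(pi, random.random())
```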
  • 24. How to represent uncertainty? A basic distribution: simple support A set E of most plausible values A confidence degree α = µ(E) Two interesting cases: Expert providing most plausible values E pH value ∈ [4.5, 5.5] with α = 0.5 (∼ "more probable than") E set of models of a formula φ Both cases extend to multiple sets E1 , . . . , Ep : confidence degrees over nested sets [36] hierarchical knowledge bases [33] (figure: π = 1 on E = [4.5, 5.5] and 1 − α = 0.5 outside, plotted over [3, 7]) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 19 / 56
  • 25. How to represent uncertainty? A basic distribution: simple support A set E of most plausible values A confidence degree α = µ(E) Two interesting cases: Expert providing most plausible values E E set of models of a formula φ Both cases extend to multiple sets E1 , . . . , Ep : confidence degrees over nested sets [36] hierarchical knowledge bases [33] Logical example: variables p, q Ω = {pq, ¬pq, p¬q, ¬p¬q} µ(p ⇒ q) = 0.9 (∼ "almost certain") E = {pq, ¬pq, ¬p¬q} π(pq) = π(¬pq) = π(¬p¬q) = 1 π(p¬q) = 0.1 (figure: π over the four interpretations) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 19 / 56
  • 26. How to represent uncertainty? Normalized likelihood as possibilities [8] [26] π(θ) = L(θ|x)/max_{θ∈Θ} L(θ|x) Binomial situation: θ = success probability x = number of observed successes (figure: π for x = 4 successes out of 11 trials, peaking at θ = 4/11, and a narrower π for x = 20 successes out of 55) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 20 / 56
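A sketch of the normalized binomial likelihood (the function name is mine; the binomial coefficient cancels in the ratio, so only the maximum-likelihood estimate x/n is needed, assumed here to lie strictly between 0 and 1):

```python
def binomial_possibility(x, n):
    """pi(theta) = L(theta|x) / max_theta L(theta|x) for a binomial sample.

    The likelihood is maximal at the MLE theta_hat = x / n, so the
    normalized likelihood is (theta/theta_hat)^x ((1-theta)/(1-theta_hat))^(n-x).
    """
    mle = x / n
    def pi(theta):
        return (theta / mle) ** x * ((1 - theta) / (1 - mle)) ** (n - x)
    return pi

pi_small = binomial_possibility(4, 11)   # 4 successes out of 11
pi_large = binomial_possibility(20, 55)  # same rate, more observations
# pi_small(4/11) == 1; pi_large is narrower than pi_small
```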
  • 27. How to represent uncertainty? Partially specified probabilities [25] [32] Triangular distribution: [µ, µ̄] encompasses all probabilities with mode/reference value M support domain [a, b] Getting back to pH: M = 5, [a, b] = [3, 7] (figure: triangular π peaking at M = 5 over [3, 7]) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 21 / 56
  • 28. How to represent uncertainty? Other examples Statistical inequalities (e.g., Chebyshev inequality) [32] Linguistic information (fuzzy sets) [28] Approaches based on nested models Sébastien Destercke (CNRS) SUM tutorial SUM 2012 22 / 56
  • 29. How to represent uncertainty? Possibility: limitations µ(A) > 0 ⇒ µ̄(A) = 1 µ̄(A) < 1 ⇒ µ(A) = 0 ⇒ interval [µ(A), µ̄(A)] always has one trivial bound Does not include probabilities as special case: ⇒ possibility and probability at odds ⇒ respective calculi hard (sometimes impossible?) to reconcile Sébastien Destercke (CNRS) SUM tutorial SUM 2012 23 / 56
  • 30. How to represent uncertainty? Going beyond Extend the theory ⇒ by complementing π with a lower distribution δ (δ ≤ π ) [11], [31] ⇒ by working with interval-valued possibility/necessity degrees [4] ⇒ by working with sets of possibility measures [7] Use a more general model ⇒ Random sets and belief functions Sébastien Destercke (CNRS) SUM tutorial SUM 2012 24 / 56
  • 31. How to represent uncertainty? Random sets and belief functions Basic tool A positive distribution m : 2^S → [0, 1], with Σ_E m(E) = 1 and usually m(∅) = 0, from which µ̄(A) = Σ_{E∩A≠∅} m(E) (Plausibility measure) µ(A) = Σ_{E⊆A} m(E) = 1 − µ̄(A^c) (Belief measure) (figure: focal sets E1 , . . . , E5 and an event A, with µ(A) summing the masses of focal sets included in A and µ̄(A) those intersecting A) [µ, µ̄] as confidence degrees of evidence theory [16], [17] bounds of an ill-known probability µ ⇒ µ ≤ µ ≤ µ̄ [14] Sébastien Destercke (CNRS) SUM tutorial SUM 2012 25 / 56
  • 32. How to represent uncertainty? Special cases Measures [µ, µ̄] include: Probability distributions: mass on atoms/singletons Possibility distributions: mass on nested sets (figure: nested focal sets E1 ⊆ E2 ⊆ E3 ⊆ E4 ) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 26 / 56
  • 33. How to represent uncertainty? Frequencies of imprecise observations Imprecise poll: "Who will win the next Wimbledon tournament?" N(adal) F(ederer) D(jokovic) M(urray) O(ther) 60 % replied {N, F , D} → m({N, F , D}) = 0.6 15 % replied "I do not know" {N, F , D, M, O} → m(S) = 0.15 10 % replied Murray {M} → m({M}) = 0.1 5 % replied others {O} → m({O}) = 0.05 ... Sébastien Destercke (CNRS) SUM tutorial SUM 2012 27 / 56
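The poll can be encoded as a mass function and queried for belief and plausibility degrees. Since the slide elides the remaining 10% of answers, the last mass below is a hypothetical completion (assigned to Nadal) just so the masses sum to 1:

```python
def belief(m, event):
    # Bel(A): total mass of focal sets included in A
    return sum(w for focal, w in m.items() if focal <= event)

def plausibility(m, event):
    # Pl(A): total mass of focal sets intersecting A
    return sum(w for focal, w in m.items() if focal & event)

S = frozenset("NFDMO")
m = {frozenset("NFD"): 0.60,
     S: 0.15,                    # "I do not know"
     frozenset("M"): 0.10,
     frozenset("O"): 0.05,
     frozenset("N"): 0.10}       # hypothetical: the slide elides the last 10%
```

For instance Bel({M}) = 0.1 (only Murray's committed supporters) while Pl({M}) = 0.25 (adding the undecided), and Bel(A) = 1 − Pl(A^c) holds by duality.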
  • 34. How to represent uncertainty? P-box [34] A pair [F , F̄ ] of cumulative distributions Bounds over events [−∞, x] Expert providing percentiles: 0 ≤ P([−∞, 12]) ≤ 0.2 0.2 ≤ P([−∞, 24]) ≤ 0.4 0.6 ≤ P([−∞, 36]) ≤ 0.8 Percentiles by experts; Kolmogorov-Smirnov bounds; Can be extended to any pre-ordered space [30], [37] ⇒ multivariate spaces! (figure: staircase bounds with focal sets E1 , . . . , E5 over [6, 42]) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 28 / 56
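The expert's percentile bounds induce staircase bounds on the cumulative distribution; a sketch (the constraint encoding and function name are mine):

```python
# Expert bounds lo <= P([-inf, t]) <= hi, one triple (t, lo, hi) per percentile
constraints = [(12, 0.0, 0.2), (24, 0.2, 0.4), (36, 0.6, 0.8)]

def pbox_bounds(x):
    """Staircase bounds [F_lower(x), F_upper(x)] on the CDF at x."""
    f_low, f_up = 0.0, 1.0
    for t, lo, hi in constraints:
        if t <= x:
            f_low = max(f_low, lo)   # past t, the CDF is at least lo
        if x <= t:
            f_up = min(f_up, hi)     # up to t, the CDF is at most hi
    return f_low, f_up
```

For instance pbox_bounds(30) gives (0.2, 0.8): between two percentile thresholds, only the surrounding constraints apply.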
  • 35. How to represent uncertainty? Other means to get random sets/belief functions Extending modal logic: probability of provability [18] Parameter estimation using pivotal quantities [15] Statistical confidence regions [29] Modify source information by its reliability [35] ... Sébastien Destercke (CNRS) SUM tutorial SUM 2012 29 / 56
  • 36. How to represent uncertainty? Limits of random sets Not yet a satisfactory extension of the Bayesian/subjective approach Still some items of information it cannot model in a simple way, e.g., probabilistic bounds over atoms si (imprecise histograms, . . . ) [27]; comparative assessments such as 2P(B) ≤ P(A) (figure: imprecise histogram) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 30 / 56
  • 37. How to represent uncertainty? Imprecise probabilities Basic tool A set P of probabilities on S or an equivalent representation µ̄(A) = sup_{P∈P} P(A) (Upper probability) µ(A) = inf_{P∈P} P(A) = 1 − µ̄(A^c) (Lower probability) [µ, µ̄] as subjective lower and upper betting rates [23] bounds of an ill-known probability measure µ ⇒ µ ≤ µ ≤ µ̄ [19] [24] Sébastien Destercke (CNRS) SUM tutorial SUM 2012 31 / 56
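With a finite set of probabilities, upper and lower probabilities are simply pointwise suprema and infima over the set; a sketch with illustrative distributions:

```python
# A finite credal set: three candidate distributions over {s1, s2, s3}
credal = [{"s1": 0.2, "s2": 0.5, "s3": 0.3},
          {"s1": 0.4, "s2": 0.4, "s3": 0.2},
          {"s1": 0.3, "s2": 0.6, "s3": 0.1}]

def prob(p, event):
    return sum(p[s] for s in event)

def upper(event):
    # upper probability: supremum over the credal set
    return max(prob(p, event) for p in credal)

def lower(event):
    # lower probability: infimum over the credal set
    return min(prob(p, event) for p in credal)

A = {"s1", "s2"}
# lower(A) = 0.7, upper(A) = 0.9, and lower(A) = 1 - upper(complement)
```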
  • 38. How to represent uncertainty? Means to get imprecise probabilistic models Include all representations seen so far . . . . . . and a couple of others probabilistic comparisons density ratio-class expectation bounds ... fully coherent extension of the Bayesian approach P(θ|x) ∝ L(θ|x)P(θ) → often easy for "conjugate priors" [22] make probabilistic logic approaches imprecise [21, 20] Sébastien Destercke (CNRS) SUM tutorial SUM 2012 32 / 56
  • 39. How to represent uncertainty? A crude summary Possibility distributions +: very simple, natural in many situations (nestedness), extend set-based approach -: at odds with probability theory, limited expressiveness Random sets +: include probabilities and possibilities, include many models used in practice -: general models can be intractable, limited expressiveness Imprecise probabilities +: most consistent extension of probabilistic approach, very flexible -: general models can be intractable Sébastien Destercke (CNRS) SUM tutorial SUM 2012 33 / 56
  • 40. How to represent uncertainty? A not completely accurate but useful picture (figure: theories ordered by expressivity/flexibility, from Sets and Probability/Possibility up to Random sets and Imprecise probability; probabilities are able to model variability, sets are incompleteness tolerant, and general tractability (scalability) decreases as expressivity grows) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 34 / 56
  • 41. How to draw conclusions from information and decide? Plan 1 Introductory elements 2 How to represent uncertainty? 3 How to draw conclusions from information and decide? 4 Some final comments Sébastien Destercke (CNRS) SUM tutorial SUM 2012 35 / 56
  • 42. How to draw conclusions from information and decide? Section goals Introduce the inference problem Introduce the notion of joint models Introduce how (basic) decision can be done Give some basic illustrations, mainly from regression/classification/reliability Sébastien Destercke (CNRS) SUM tutorial SUM 2012 36 / 56
  • 43. How to draw conclusions from information and decide? The problem data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information uncertain Input: marginal pieces of information on a part of the data space and the model Step 1: build a joint model from marginal information Step 2: deduce information (by propagation, conditioning, . . . ) on data Sébastien Destercke (CNRS) SUM tutorial SUM 2012 37 / 56
  • 44. How to draw conclusions from information and decide? Closeness requirement data (population): datum: ω {ω1 , . . . , ωn } + model singular information generic information If the partial/marginal pieces of information are of type x, then the joint model is of type x and the deduced information is of type x, where x ∈ {Prob. distribution, Poss. distribution, Belief function, Prob. set} Sébastien Destercke (CNRS) SUM tutorial SUM 2012 38 / 56
  • 45. How to draw conclusions from information and decide? Straight ahead y = ax + b No uncertainty: a = 2.5, b = 3.5 → joint: the point (2.5, 3.5) Infer y when x = 3: y = 2.5 × 3 + 3.5 = 11 (figure: the line y = 2.5x + 3.5) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 39 / 56
  • 46. How to draw conclusions from information and decide? Straight ahead y = ax + b Imprecision: a ∈ [2, 3], b ∈ [3, 4] → joint: [2, 3] × [3, 4] Infer y when x = 3: y ∈ [2 × 3 + 3, 3 × 3 + 4] = [9, 13] (figure: the bundle of lines y = ax + b for (a, b) in the joint set) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 39 / 56
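For a monotone map like y = ax + b with x ≥ 0, interval propagation only needs the bounds of a and b; a sketch reproducing both slides' numbers (the function name is mine):

```python
def interval_affine(a, b, x):
    """Bounds of y = a*x + b when a and b range over intervals (x >= 0)."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    return (a_lo * x + b_lo, a_hi * x + b_hi)

# precise slide: a = 2.5, b = 3.5, x = 3 gives y = 11
# imprecise slide:
y = interval_affine((2, 3), (3, 4), 3)   # y lies in [9, 13]
```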
  • 47. How to draw conclusions from information and decide? Joint models: possibilistic illustration (figure: marginal possibility distributions πa over a and πb over b, combined into a joint possibility distribution over (a, b)) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 40 / 56
  • 48. How to draw conclusions from information and decide? Fuzzy straight ahead y = ax + b Possibilistic uncertainty on a and b Infer y when x = 3 (figure: resulting possibilistic bounds on y) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 41 / 56
  • 50. How to draw conclusions from information and decide? Reliable or not? Model: structure function φ : C1 × C2 → S Components C1 , C2 : {0, 1} with p(0) = 0.1, p(1) = 0.9 Joint (independence): p(0 × 0) = 0.01 p(0 × 1) = 0.09 p(1 × 0) = 0.09 p(1 × 1) = 0.81 System S : {0, 1} with p(0) = 0.01, p(1) = 0.99 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 42 / 56
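The slide's numbers correspond to a parallel (OR) structure function with independent components; a sketch reproducing them (function names are mine):

```python
from itertools import product

def system_reliability(phi, p1, p2):
    """P(S = 1) for structure function phi over independent components."""
    return sum(p1[c1] * p2[c2]
               for c1, c2 in product([0, 1], repeat=2) if phi(c1, c2) == 1)

p = {0: 0.1, 1: 0.9}
parallel = lambda c1, c2: c1 | c2   # system fails only if both components fail
# system_reliability(parallel, p, p) is 0.99, as on the slide
```

Swapping in a series (AND) structure instead gives 0.81, showing how the structure function drives the result.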
  • 51. How to draw conclusions from information and decide? Reliable or not? Model: structure function φ : C1 × C2 → S Component masses: C1 : {0, 1} with m({0}) = 0.05, m({1}) = 0.75, m({0, 1}) = 0.2 C2 : {0, 1} with m({0}) = 0.05, m({1}) = 0.75, m({0, 1}) = 0.2 Joint masses: m({0} × {0}) = 0.025 m({1} × {1}) = 0.5625 m({0} × {0, 1}) = 0.01 m({0, 1} × {0, 1}) = 0.04 ... System S : {0, 1} with m({0}) = 0.0025, m({1}) = 0.9575, m({0, 1}) = 0.04 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 42 / 56
  • 52. How to draw conclusions from information and decide? Two kinds of decision Binary: whether to take an action or not risk/reliability analysis (take the risk or not) logic (decide if a sentence is true) binary classification Non-binary: decide among multiple choices classification control, planning, . . . Introducing imprecision allows for incomparability Sébastien Destercke (CNRS) SUM tutorial SUM 2012 43 / 56
  • 53. How to draw conclusions from information and decide? Binary case A threshold τ , two decisions {1, −1} Precise evaluation µ(a) below τ → Decision: -1 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 44 / 56
  • 54. How to draw conclusions from information and decide? Binary case A threshold τ , two decisions {1, −1} Imprecise evaluation [µ(a), µ̄(a)] with both bounds below τ → Decision: -1 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 44 / 56
  • 55. How to draw conclusions from information and decide? Binary case A threshold τ , two decisions {1, −1} Imprecise evaluation [µ(a), µ̄(a)] straddling τ → No winner: {-1, 1} Maximin: -1 Maximax: 1 Sébastien Destercke (CNRS) SUM tutorial SUM 2012 44 / 56
  • 56. How to draw conclusions from information and decide? Multiple choice case (figure: interval-valued evaluations [µ(yi ), µ̄(yi )] for options y1 , . . . , y5 ) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 45 / 56
  • 57. How to draw conclusions from information and decide? Multiple choice case Dominance: {y1 , y3 , y5 } (an option is removed when another dominates it, i.e., has its lower bound above the option's upper bound) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 45 / 56
  • 58. How to draw conclusions from information and decide? Multiple choice case Maximin: {y3 } (best lower bound) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 45 / 56
  • 59. How to draw conclusions from information and decide? Multiple choice case Maximax: {y1 } (best upper bound) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 45 / 56
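The three rules can be sketched on interval-valued evaluations; the numbers below are hypothetical but chosen to reproduce the slides' answers:

```python
# Interval-valued evaluations [lower, upper] for each option
# (hypothetical numbers, chosen to match the slides)
options = {"y1": (1, 5), "y2": (2, 2.8), "y3": (3, 4),
           "y4": (1, 2), "y5": (2.5, 4.5)}

def maximin(opts):
    # best worst-case value
    return max(opts, key=lambda y: opts[y][0])

def maximax(opts):
    # best best-case value
    return max(opts, key=lambda y: opts[y][1])

def interval_dominance(opts):
    # keep y unless some z is surely better (z's lower bound above y's upper)
    return {y for y in opts
            if not any(opts[z][0] > opts[y][1] for z in opts)}
```

Here interval_dominance keeps {y1, y3, y5}, maximin selects y3 and maximax selects y1, as on the slides; note that dominance returns a set, allowing incomparability.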
  • 60. Some final comments Plan 1 Introductory elements 2 How to represent uncertainty? 3 How to draw conclusions from information and decide? 4 Some final comments Sébastien Destercke (CNRS) SUM tutorial SUM 2012 46 / 56
  • 61. Some final comments Why modelling uncertainty (outside intellectual satisfaction)? Because . . . . . . you should (risk/reliability analysis) . . . it solves existing issues (non-monotonic reasoning) . . . it gives better/more robust results with acceptable computational burden Sébastien Destercke (CNRS) SUM tutorial SUM 2012 47 / 56
  • 62. Some final comments Scalability Adding flexibility to the model → increases scalability issues already true for probability and intervals, only gets worse if the model is more complex How to solve it? As in other domains: approximation, model reduction, . . . → make things as simple as possible (but not simpler) to answer your question sampling use flexibility only where you need it Sébastien Destercke (CNRS) SUM tutorial SUM 2012 48 / 56
  • 63. Some final comments One advantage of incompleteness Using approximations: choice between outer/inner approximation (figure: outer and inner approximations of an uncertainty model over x1 , x2 , x3 ) Sébastien Destercke (CNRS) SUM tutorial SUM 2012 49 / 56
  • 64. Some final comments References I General bibliography [1] B. de Finetti. Theory of probability, volume 1-2. Wiley, NY, 1974. Translation of 1970 book. [2] L. Jaulin, M. Kieffer, O. Didrit, and E. Walter. Applied Interval Analysis. London, 2001. [3] R. von Mises. Probability, Statistics and Truth. Dover books explaining science. Dover Publications, 1981. Possibility theory [4] Salem Benferhat, Julien Hué, Sylvain Lagrue, and Julien Rossit. Interval-based possibilistic logic. In IJCAI, pages 750–755, 2011. [5] I. Couso, S. Montes, and P. Gil. The necessity of the strong alpha-cuts of a fuzzy set. Int. J. on Uncertainty, Fuzziness and Knowledge-Based Systems, 9:249–262, 2001. [6] G. de Cooman and D. Aeyels. Supremum-preserving upper probabilities. Information Sciences, 118:173–212, 1999. [7] D. Dubois. Fuzzy measures on finite scales as families of possibility measures. In Proc. European Society for Fuzzy Logic and Technology conference, 2011. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 50 / 56
  • 65. Some final comments References II [8] D. Dubois, S. Moral, and H. Prade. A semantics for possibility theory based on likelihoods,. Journal of Mathematical Analysis and Applications, 205(2):359 – 380, 1997. [9] D. Dubois and H. Prade. Possibility Theory: An Approach to Computerized Processing of Uncertainty. Plenum Press, New York, 1988. [10] D. Dubois and H. Prade. When upper probabilities are possibility measures. Fuzzy Sets and Systems, 49:65–74, 1992. [11] Didier Dubois and Henri Prade. An overview of the asymmetric bipolar representation of positive and negative information in possibility theory. Fuzzy Sets and Systems, 160(10):1355–1366, 2009. [12] G. L. S. Shackle. Decision, Order and Time in Human Affairs. Cambridge University Press, Cambridge, 1961. [13] L.A. Zadeh. Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets and Systems, 1:3–28, 1978. Random sets and belief functions [14] A.P. Dempster. Upper and lower probabilities induced by a multivalued mapping. Annals of Mathematical Statistics, 38:325–339, 1967. [15] Ryan Martin, Jianchun Zhang, and Chuanhai Liu. Dempster-shafer theory and statistical inference with weak beliefs. Technical report, 2008. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 51 / 56
  • 66. Some final comments References III [16] G. Shafer. A mathematical Theory of Evidence. Princeton University Press, New Jersey, 1976. [17] P. Smets. The transferable belief model and other interpretations of dempster-shafer’s model. In Proc. of the Sixth Annual Confernce on Uncertainty in Artifical Intelligence, pages 375–384, 1990. [18] Philippe Smets. Probability of provability and belief functions. Logique et Analyse, 133-134:177–195, 1991. Imprecise probability [19] J. O. Berger. An overview of robust Bayesian analysis. Test, 3:5–124, 1994. With discussion. [20] Gert de Cooman. Belief models: An order-theoretic investigation. Ann. Math. Artif. Intell., 45(1-2):5–34, 2005. [21] Pierre Hansen, Brigitte Jaumard, Marcus Poggi de Aragão, Fabien Chauny, and Sylvain Perron. Probabilistic satisfiability with imprecise probabilities. Int. J. Approx. Reasoning, 24(2-3):171–189, 2000. [22] Erik Quaeghebeur and Gert de Cooman. Imprecise probability models for inference in exponential families. In ISIPTA, pages 287–296, 2005. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 52 / 56
  • 67. Some final comments References IV [23] P. Walley. Statistical reasoning with imprecise Probabilities. Chapman and Hall, New York, 1991. [24] Kurt Weichselberger. The theory of interval-probability as a unifying concept for uncertainty. International Journal of Approximate Reasoning, 24(2–3):149 – 170, 2000. Practical representations [25] C. Baudrit and D. Dubois. Practical representations of incomplete probabilistic knowledge. Computational Statistics and Data Analysis, 51(1):86–108, 2006. [26] M. Cattaneo. Likelihood-based statistical decisions. In Proc. 4th International Symposium on Imprecise Probabilities and Their Applications, pages 107–116, 2005. [27] L.M. de Campos, J.F. Huete, and S. Moral. Probability intervals: a tool for uncertain reasoning. I. J. of Uncertainty, Fuzziness and Knowledge-Based Systems, 2:167–196, 1994. [28] G. de Cooman and P. Walley. A possibilistic hierarchical model for behaviour under uncertainty. Theory and Decision, 52:327–374, 2002. [29] T. Denoeux. Constructing belief functions from sample data using multinomial confidence regions. I. J. of Approximate Reasoning, 42:228–252, 2006. [30] S. Destercke, D. Dubois, and E. Chojnacki. Unifying practical uncertainty representations: I generalized p-boxes. Int. J. of Approximate Reasoning, 49:649–663, 2008. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 53 / 56
  • 68. Some final comments References V [31] S. Destercke, D. Dubois, and E. Chojnacki. Unifying practical uncertainty representations: II clouds. Int. J. of Approximate Reasoning (in press), pages 664–677, 2008. [32] D. Dubois, L. Foulloy, G. Mauris, and H. Prade. Probability-possibility transformations, triangular fuzzy sets, and probabilistic inequalities. Reliable Computing, 10:273–297, 2004. [33] Didier Dubois and Henri Prade. Possibilistic logic: a retrospective and prospective view. Fuzzy Sets and Systems, 144(1):3 – 23, 2004. [34] S. Ferson, L. Ginzburg, V. Kreinovich, D.M. Myers, and K. Sentz. Constructing probability boxes and dempster-shafer structures. Technical report, Sandia National Laboratories, 2003. [35] Frédéric Pichon, Didier Dubois, and Thierry Denoeux. Relevance and truthfulness in information correction and fusion. Int. J. Approx. Reasoning, 53(2):159–175, 2012. [36] S.A. Sandri, D. Dubois, and H.W. Kalfsbeek. Elicitation, assessment and pooling of expert judgments using possibility theory. IEEE Trans. on Fuzzy Systems, 3(3):313–335, August 1995. [37] Matthias C. M. Troffaes and Sébastien Destercke. Probability boxes on totally preordered spaces for multivariate modelling. Int. J. Approx. Reasoning, 52(6):767–791, 2011. Independence notions [38] B. Bouchon-Meunier, G. Coletti, and C. Marsala. Independence and possibilistic conditioning. Annals of Mathematics and Artificial Intelligence, 35:107–123, 2002. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 54 / 56
  • 69. Some final comments References VI [39] I. Couso and S. Moral. Independence concepts in evidence theory. International Journal of Approximate Reasoning, 51:748–758, 2010. [40] I. Couso, S. Moral, and P. Walley. A survey of concepts of independence for imprecise probabilities. Risk Decision and Policy, 5:165–181, 2000. [41] L. de Campos and J. Huete. Independence concepts in possibility theory: Part ii. Fuzzy Sets and Systems, 103:487–505, 1999. [42] G. de Cooman. Possibility theory III: possibilistic independence. International Journal of General Systems, 25:353–371, 1997. [43] G. de Cooman and E. Miranda. Independent natural extension for sets of desirable gambles. In F. Coolen, G. de Cooman, T. Fetz, and M. Oberguggenberger, editors, ISIPTA ’11: Proceedings of the Seventh International Symposium on Imprecise Probabilities: Theories and Applications, pages 169–178, Innsbruck, 2011. Action M Agency for SIPTA. [44] G. de Cooman, E. Miranda, and M. Zaffalon. Independent natural extension. Artificial Intelligence, 174:1911–1950, 2011. [45] D. Dubois, L. Farinas del Cerro, A. Herzig, and H. Prade. A roadmap of qualitative independence. In D. Dubois, H. Prade, and E.P. Klement, editors, Fuzzy sets, logics, and reasoning about knowledge. Springer, 1999. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 55 / 56
  • 70. Some final comments References VII [46] B. Ben Yaghlane, P. Smets, and K. Mellouli. Belief function independence: I. the marginal case. I. J. of Approximate Reasoning, 29(1):47–70, 2002. Inference [47] C. Baudrit, I. Couso, and D. Dubois. Joint propagation of probability and possibility in risk analysis: towards a formal framework. Int. J. of Approximate Reasoning, 45:82–105, 2007. [48] T. Fetz and M. Oberguggenberger. Propagation of uncertainty through multivariate functions in the framework of sets of probability measures. Reliability Engineering and System Safety, 85:73–87, 2004. [49] Pierre Hansen, Brigitte Jaumard, Marcus Poggi de Aragão, Fabien Chauny, and Sylvain Perron. Probabilistic satisfiability with imprecise probabilities. Int. J. Approx. Reasoning, 24(2-3):171–189, 2000. Decision [50] Didier Dubois, Hélène Fargier, and Patrice Perny. Qualitative decision theory with preference relations and comparative uncertainty: An axiomatic approach. Artif. Intell., 148(1-2):219–260, 2003. [51] P. Smets. Decision making in the tbm: the necessity of the pignistic transformation. I.J. of Approximate Reasoning, 38:133–147, 2005. [52] M.C.M. Troffaes. Decision making under uncertainty using imprecise probabilities. Int. J. of Approximate Reasoning, 45:17–29, 2007. Sébastien Destercke (CNRS) SUM tutorial SUM 2012 56 / 56