Introduction and context | State of the art | Conclusion & Perspectives
Learning possibilistic networks from data: a survey
Maroua Haddad¹,², Nahla Ben Amor¹ and Philippe Leray²
1 Laboratoire de Recherche Opérationnelle de Décision et de Contrôle de
Processus (LARODEC), ISG Tunis, Tunisie
2 Laboratoire d’Informatique de Nantes Atlantique (LINA), UMR CNRS 6241,
Université de Nantes, France
Maroua Haddad Learning possibilistic networks from data 1/21
Outline ...
1. Introduction and context
2. State of the art
2.1. Possibilistic network, inference, sampling
2.2. Parameter learning
2.3. Structure learning
3. Conclusion & Perspectives
Background
Possibility theory [Zadeh, 1978], [Dubois and Prade, 1980]
an alternative uncertainty theory
complementing probability theory in order to handle uncertain, imprecise and missing information
Background
Possibility theory [Zadeh, 1978], [Dubois and Prade, 1980]

Possibility distribution π : Ω → [0, 1]
normalized by its maximum: max_ω π(ω) = 1, not by the integral ∫π

[figure: an example possibility distribution π over x, taking values between 0 and 1, e.g. π(x) = 0.6]

Possibility measure Π: is A coherent with π?
Π(A) = max_{ω ∈ A} π(ω)

Necessity measure N: is A certainly implied by π?
N(A) = 1 − Π(¬A)
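As an aside not on the slides, both measures are straightforward to compute over a finite universe; the distribution and the events below are invented for illustration:

```python
# Sketch: possibility and necessity measures on a finite universe.
# The distribution and events are illustrative, not from the talk.

def possibility(pi, event):
    """Pi(A) = max over omega in A of pi(omega): is A coherent with pi?"""
    return max((pi[w] for w in event), default=0.0)

def necessity(pi, event):
    """N(A) = 1 - Pi(not A): is A certainly implied by pi?"""
    complement = [w for w in pi if w not in event]
    return 1.0 - possibility(pi, complement)

# Normalized distribution: its maximum is 1, not its integral.
pi = {"x1": 1.0, "x2": 0.6, "x3": 0.2}

print(possibility(pi, {"x2", "x3"}))  # -> 0.6
print(necessity(pi, {"x1", "x2"}))    # close to 0.8 = 1 - pi("x3")
```

Note that duality holds by construction: an event is certain only to the degree that its complement is impossible.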
Background

Possibilistic scale: two distinct understandings of a possibility distribution

numerical interpretation (PROD): a possibility distribution may encode a piece of imprecise knowledge about a situation, as in approximate reasoning (quantitative possibility theory)

ordinal interpretation (MIN): only the ordering of the possibility values matters, rather than their exact values, usually in order to describe some user preferences (qualitative possibility theory)
Background

[figure: three possibility distributions π1, π2, π3 plotted over the same universe, ranging from complete knowledge (π1) to total ignorance (π3)]

Specificity
π1 more specific than π2, π2 more specific than π3
from complete knowledge to total ignorance
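The specificity ordering sketched in the figure can be stated pointwise: π1 is at least as specific as π2 when π1(ω) ≤ π2(ω) everywhere. A minimal illustration (the three distributions are invented, not the ones plotted):

```python
# Sketch: the specificity ordering between possibility distributions.
# pi1 is at least as specific as pi2 iff pi1(w) <= pi2(w) for every w.

def more_specific(pi1, pi2):
    return all(pi1[w] <= pi2[w] for w in pi2)

complete_knowledge = {"a": 1.0, "b": 0.0, "c": 0.0}  # one fully possible value
intermediate       = {"a": 1.0, "b": 0.5, "c": 0.0}
total_ignorance    = {"a": 1.0, "b": 1.0, "c": 1.0}  # everything fully possible

print(more_specific(complete_knowledge, intermediate))  # True
print(more_specific(intermediate, total_ignorance))     # True
print(more_specific(total_ignorance, intermediate))     # False
```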
Learning from data
C S R W
c1 s2 r3 w1
c3 s1 r2 w2
c1 s2 r3 w3
c2 s1 r2 w3
c2 s2 r2 w1
c1 s2 r1 w1
c2 s1 r2 w3
c1 s2 r3 w3
c3 s1 r3 w2
c2 s1 r1 w3
c2 s1 r2 w1
Which kind of data?
certain data
imprecise data
possibilistic data
Learning from data
C S R W
c1 s2 r3 w1
c3 s1 r2 w2
c1 s2 r3 w3
c2 s1,s2 r2 w2,w3
c2 s2 r1,r2 w1
c1 s2 r1
c2 s1 r2 w3
c1 s2 r3 w3
c3 s1 r3 w2
c2 s1 r2,r3 w3
c2 s1,s2 r2 w1,w2,w3
Which kind of data?
certain data
imprecise data
possibilistic data
Learning from data
C S R W
c1 s2 r3 w1
c3 s1 r2 w2
c1 s2 r3 w3
c2 [1,0.8] r2 [1,1,1]
c2 s2 [1,1,1] w1
c1 s2 r1 [0.2,0.5,1]
c2 s1 r2 w3
c1 s2 r3 w3
c3 s1 r3 w2
c2 s1 [0.1,0.9,1] w3
c2 [0.2,1] r2 [1,1,1]
Which kind of data?
certain data
imprecise data
possibilistic data
Outline ...
1. Introduction and context
2. State of the art
2.1. Possibilistic network, inference, sampling
2.2. Parameter learning
2.3. Structure learning
3. Conclusion & Perspectives
Possibilistic networks [Fonck, 1994]

[figure: DAG with Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → Wet Grass, Rain → Wet Grass, annotated with the local tables Π(C), Π(S|C), Π(R|C), Π(W|S,R)]

Product-based possibilistic networks
π(ω |p A) = π(ω) / Π(A) if ω ∈ A, 0 otherwise

Min-based possibilistic networks
π(ω |m A) = 1 if π(ω) = Π(A) and ω ∈ A,
π(ω) if π(ω) < Π(A) and ω ∈ A,
0 otherwise

Possibilistic chain rule
π(V1, ..., VN) = ⊗_{i=1..N} π(Vi | Pa(Vi)), where ⊗ is product or min
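The chain rule can be evaluated on a toy two-node network A → B; the local tables below are invented for illustration, not the Sprinkler network of the slide:

```python
# Sketch: the possibilistic chain rule on a toy 2-node network A -> B,
# under both readings of the combination operator.

pi_A = {"a1": 1.0, "a2": 0.5}
pi_B_given_A = {("b1", "a1"): 1.0, ("b2", "a1"): 0.25,
                ("b1", "a2"): 0.75, ("b2", "a2"): 1.0}

def joint(a, b, combine):
    # pi(A, B) = pi(A) (x) pi(B | A), with (x) = product (PROD) or min (MIN)
    return combine(pi_A[a], pi_B_given_A[(b, a)])

print(joint("a2", "b1", lambda x, y: x * y))  # PROD: 0.5 * 0.75 = 0.375
print(joint("a2", "b1", min))                 # MIN:  min(0.5, 0.75) = 0.5
```

The example makes the qualitative/quantitative split concrete: the min-based joint only depends on the ordering of the degrees, while the product-based joint uses their numerical values.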
Possibilistic inference
junction tree [Fonck, 1992], [Borgelt and Kruse, 1998] (PROD, MIN)
anytime propagation [Ben Amor et al., 2003] (MIN)
compilation techniques [Ayachi et al., 2010] (PROD, MIN)
loopy belief propagation [Ajroud and Benferhat, 2014] (MIN)
Simulation

Sampling
in the quantitative possibilistic framework
by combining Monte Carlo random sampling and α-cuts
certain data [Chanas and Nowakowski, 1988]
imprecise data [Guyonnet, 2003]

and for a PROD possibilistic network?
forward sampling (as used for BNs) seems adequate for certain data
but how to deal with imprecise data, i.e. how to sample Xi when its parents do not have a certain value?
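One way to read the Monte Carlo / α-cut combination, offered as an assumption-laden sketch rather than the cited algorithms: draw α uniformly in [0, 1], then draw a value from the α-cut {x : π(x) ≥ α}:

```python
# Sketch: Monte Carlo sampling with alpha-cuts for a finite possibility
# distribution. Illustrative reading, not the cited algorithms.
import random

def alpha_cut_sample(pi, rng):
    alpha = rng.uniform(0.0, 1.0)
    cut = [x for x, p in pi.items() if p >= alpha]
    return rng.choice(cut)  # normalization (max pi = 1) keeps the cut non-empty

rng = random.Random(0)
pi = {"x1": 1.0, "x2": 0.6, "x3": 0.2}
draws = [alpha_cut_sample(pi, rng) for _ in range(1000)]
print({x: draws.count(x) for x in pi})  # x1 drawn most often, x3 least
```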
Parameter learning

Objective
for a given structure, how can we estimate π(Xi | Pa(Xi))?
satisfying the Maximum Uncertainty Principle (MUP) [Klir, 1990]: when a problem solution is undetermined, the possible solution with the highest uncertainty should be chosen
in possibility theory: minimize non-specificity
... by using probability-possibility transformation

Direct transformations
several existing transformations with different properties [Klir and Parviz, 1992], [Dubois et al., 1993, 2004], [Mouchaweh et al., 2006], [Bouguelid, 2007]
applicable to the joint possibility distribution

Parameter learning?
certain data → joint probability distribution
transformation into a joint possibility distribution
then marginalization in order to find the conditional possibility distributions

Drawbacks
working with the joint distributions is not efficient
supposing that the probability estimation is perfect is not realistic
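For concreteness, one classical direct transformation (the optimal probability-possibility transformation usually credited to the Dubois et al. line of work cited above, stated here from general knowledge, not from the slides) maps probabilities sorted as p1 ≥ p2 ≥ ... ≥ pn to πi = Σ_{j ≥ i} pj. A sketch with naive tie handling and an invented distribution:

```python
# Sketch: a classical probability-to-possibility transformation.
# With p1 >= p2 >= ... >= pn, set pi_i = sum_{j >= i} p_j.
# Stated from general knowledge of the cited literature; ties handled naively.

def prob_to_poss(p):
    # Sort outcomes by decreasing probability; pi of an outcome is the total
    # mass of that outcome and all outcomes no more probable than it.
    order = sorted(p, key=p.get, reverse=True)
    pi, tail = {}, 1.0
    for x in order:
        pi[x] = tail
        tail -= p[x]
    return pi

p = {"x1": 0.5, "x2": 0.3, "x3": 0.2}
print(prob_to_poss(p))  # {'x1': 1.0, 'x2': 0.5, 'x3': 0.2}
```

The most probable outcome always gets possibility 1, so the result is normalized, and the distribution is as specific as possible while still dominating the probabilities.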
... by using probability-possibility transformation

Confidence interval
[De Campos and Huete, 2001]
a mixture of the [Klir and Parviz, 1992] and [Dubois et al., 1993, 2004] transformations
deals with two probability distributions, min and max
(+) applicable to local conditional possibility distributions

Drawbacks
(-) but does not preserve the joint possibility distribution
... directly from data

Using probability transformation with confidence intervals?
[Masson and Denoeux, 2006], certain data
consider a confidence interval for the estimation of the joint probability distribution
find all the possibility distributions compatible with these constraints
... but this sometimes returns no solution

Using possibilistic histograms?
[Joslyn, 1994]
certain data, imprecise data, interval data
conditional distributions
(+) satisfies the Minimum Non-Specificity principle

Conclusion
no existing solution for parameter learning
some possible solutions (possibilistic histograms) for PROD networks
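A common reading of a possibilistic histogram over certain data is to normalize counts by the maximum count, so the mode gets possibility 1; this max-normalization reading is an assumption of ours, not a quote from the talk or from [Joslyn, 1994]:

```python
# Sketch: a possibilistic histogram from certain data, obtained by
# normalizing counts by the maximum count (mode gets possibility 1).
# The max-normalization reading is an assumption, as is the sample data.
from collections import Counter

def possibilistic_histogram(samples):
    counts = Counter(samples)
    top = max(counts.values())
    return {x: n / top for x, n in counts.items()}

data = ["c1", "c2", "c2", "c2", "c3"]
print(possibilistic_histogram(data))  # c2 (the mode) gets 1.0
```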
Structure learning

Constraint-based methods
no existing algorithm
how to measure possibilistic dependency?
possibilistic mutual information (imprecise data) [Borgelt and Kruse, 2003]
possibilistic χ² (imprecise data) [Borgelt and Kruse, 2003]

Score-based methods
adaptation of BN algorithms for certain data [Gebhardt and Kruse, 1996] or imprecise data [Borgelt and Gebhardt, 1997], [Borgelt and Kruse, 2003]
(-) discordance between global and local scores
(-) do not take the Minimum Non-Specificity principle into account

Hybrid methods
POSSCAUSE algorithm (certain data) [Sangüesa et al., 1998]
(-) incoherent definition of the independence test (a different formula in each paper)

Conclusion
existing but unsatisfactory solutions for PROD networks
many open questions: for instance, what is Markov equivalence here?
In practice

Applications
automotive industry [Gebhardt and Kruse, 1995]
aerospace [Kruse and Borgelt, 1998]
information retrieval [Boughanem et al., 2008]

Implementations
possibilistic inference:
POSSINFER [Gebhardt and Kruse, 1995]
Pulcinella [Saffiotti and Umkehrer, 1991]
Possibilistic Networks Toolbox for Matlab [Ben Amor, 2012]
Outline ...
1. Introduction and context
2. State of the art
2.1. Possibilistic network, inference, sampling
2.2. Parameter learning
2.3. Structure learning
3. Conclusion & Perspectives
Conclusion

Possibilistic networks
an alternative uncertainty theory devoted to imprecision modelling
possibilistic networks have essentially been studied in terms of possibilistic inference
many open problems remain for the learning task
MIN networks seem better suited to expert elicitation?

Perspectives
new learning algorithms (parameters + structure) based on consistent theoretical properties
validation? we need benchmarks, or data generation from a possibilistic distribution
Matlab implementation
using more sophisticated imperfect data (imprecise data, possibilistic data, interval data, ...)