Machine Learning

Exact Inference

Eric Xing
Lecture 11, August 14, 2010
Reading:

[Figure: graph elimination on the eight-node example network (nodes A-H), annotated with the intermediate messages m_b, m_c, m_d, m_e, m_f, m_g, m_h]
Inference and Learning

- We now have compact representations of probability distributions: GMs
  - A BN M describes a unique probability distribution P
- Typical tasks:
  - Task 1: How do we answer queries about P?
    - We use inference as a name for the process of computing answers to such queries.
  - Task 2: How do we estimate a plausible model M from data D?
    i. We use learning as a name for the process of obtaining a point estimate of M.
    ii. But Bayesians seek the posterior p(M | D), which is itself an inference problem.
    iii. When not all variables are observable, even computing a point estimate of M requires inference to impute the missing data.
Inferential Query 1: Likelihood

- Most of the queries one may ask involve evidence
  - Evidence x_V is an assignment of values to a set X_V of nodes in the GM over the variable set X = {X_1, X_2, ..., X_n}
  - Without loss of generality, X_V = {X_{k+1}, ..., X_n}
  - Write X_H = X \ X_V for the set of hidden variables; X_H can be ∅ or X
- Simplest query: compute the probability of the evidence (a brute-force sketch follows below)

  $$P(\mathbf{x}_V) = \sum_{x_1} \cdots \sum_{x_k} P(x_1, \ldots, x_k, \mathbf{x}_V) = \sum_{\mathbf{x}_H} P(\mathbf{x}_H, \mathbf{x}_V)$$

  - this is often referred to as computing the likelihood of x_V
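To make the query concrete, here is a minimal brute-force sketch (not from the lecture; the two-node BN and its CPT values are made up for illustration) that computes the likelihood of evidence by summing the joint over the hidden variables:

```python
# A minimal sketch: likelihood of evidence P(x_V) by brute-force
# marginalization over the hidden variables.
# Toy BN over binary variables: X1 -> X2, with arbitrary CPTs.
p_x1 = {0: 0.6, 1: 0.4}                      # P(X1)
p_x2_given_x1 = {0: {0: 0.9, 1: 0.1},        # P(X2 | X1=0)
                 1: {0: 0.3, 1: 0.7}}        # P(X2 | X1=1)

def joint(x1, x2):
    """P(X1=x1, X2=x2) from the BN factorization."""
    return p_x1[x1] * p_x2_given_x1[x1][x2]

# Evidence: X2 = 1 observed, X1 hidden.
# Likelihood P(x_V) = sum over x_H of P(x_H, x_V).
likelihood = sum(joint(x1, 1) for x1 in (0, 1))
print(likelihood)  # 0.6*0.1 + 0.4*0.7 = 0.34
```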
Inferential Query 2: Conditional Probability

- Often we are interested in the conditional probability distribution of a variable given the evidence

  $$P(\mathbf{X}_H \mid \mathbf{x}_V) = \frac{P(\mathbf{X}_H, \mathbf{x}_V)}{P(\mathbf{x}_V)} = \frac{P(\mathbf{X}_H, \mathbf{x}_V)}{\sum_{\mathbf{x}_H} P(\mathbf{x}_H, \mathbf{x}_V)}$$

  - this is the a posteriori belief in X_H, given evidence x_V
- We usually query a subset Y of all hidden variables, X_H = {Y, Z}, and "don't care" about the remaining Z (a small sketch follows below):

  $$P(\mathbf{Y} \mid \mathbf{x}_V) = \sum_{\mathbf{z}} P(\mathbf{Y}, \mathbf{Z} = \mathbf{z} \mid \mathbf{x}_V)$$

  - the process of summing out the "don't care" variables Z is called marginalization, and the resulting P(Y | x_V) is called a marginal probability.
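A sketch of the same idea for the a posteriori belief, again with made-up numbers: the toy model is a three-node chain X1 -> X2 -> X3, the evidence is X3 = 1, the query is X1, and X2 is the "don't care" variable that gets marginalized out:

```python
# Sketch (toy chain X1 -> X2 -> X3, binary variables, arbitrary CPTs):
# a posteriori belief with marginalization of the "don't care" variable.
p1 = {0: 0.6, 1: 0.4}                              # P(X1)
p2 = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}    # P(X2 | X1)
p3 = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}    # P(X3 | X2)

def joint(x1, x2, x3):
    return p1[x1] * p2[x1][x2] * p3[x2][x3]

# Evidence x_V: X3 = 1. Hidden X_H = {X1, X2}; query Y = X1, don't-care Z = X2.
evidence_prob = sum(joint(x1, x2, 1) for x1 in (0, 1) for x2 in (0, 1))

# P(X1 | X3=1) = sum_z P(X1, Z=z, x_V) / P(x_V): marginalize out X2.
posterior_x1 = {x1: sum(joint(x1, x2, 1) for x2 in (0, 1)) / evidence_prob
                for x1 in (0, 1)}
print(posterior_x1)   # the two entries sum to 1
```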
Applications of a posteriori Belief

- Prediction: what is the probability of an outcome given the starting condition?
  - the query node is a descendant of the evidence (e.g., in the chain A -> B -> C: A observed, C queried)
- Diagnosis: what is the probability of a disease/fault given symptoms?
  - the query node is an ancestor of the evidence (C observed, A queried)
- Learning under partial observation
  - fill in the unobserved values under an "EM" setting (more later)
- The directionality of information flow between variables is not restricted by the directionality of the edges in a GM
  - probabilistic inference can combine evidence from all parts of the network
Inferential Query 3: Most Probable Assignment

- In this query we want to find the most probable joint assignment (MPA) for some variables of interest
- Such reasoning is usually performed under some given evidence x_V, and ignoring (the values of) other variables Z (a small sketch follows below):

  $$\mathbf{Y}^* \mid \mathbf{x}_V = \arg\max_{\mathbf{y}} P(\mathbf{Y} = \mathbf{y} \mid \mathbf{x}_V) = \arg\max_{\mathbf{y}} \sum_{\mathbf{z}} P(\mathbf{Y} = \mathbf{y}, \mathbf{Z} = \mathbf{z} \mid \mathbf{x}_V)$$

  - this is the maximum a posteriori configuration of Y.
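A sketch of the MPA query on the same toy chain as above (made-up CPTs; Y = X1, Z = X2, evidence X3 = 1). Since the normalizer P(x_V) is constant in y, we can maximize the unnormalized sum directly:

```python
# Sketch: most probable assignment (MPA) of Y given evidence, with the
# "don't care" variable Z summed out first (toy chain, arbitrary CPTs).
p1 = {0: 0.6, 1: 0.4}
p2 = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}
p3 = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}

def joint(x1, x2, x3):
    return p1[x1] * p2[x1][x2] * p3[x2][x3]

# Y* = argmax_y sum_z P(Y=y, Z=z | x_V); the normalizer does not affect
# the argmax, so we score with the unnormalized joint.
scores = {y: sum(joint(y, z, 1) for z in (0, 1)) for y in (0, 1)}
y_star = max(scores, key=scores.get)
print(y_star, scores)
```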
Complexity of Inference

Thm: Computing P(X_H = x_H | x_V) in an arbitrary GM is NP-hard

- Hardness does not mean we cannot solve inference
  - It implies that we cannot find a general procedure that works efficiently for arbitrary GMs
  - For particular families of GMs, we can have provably efficient procedures
Approaches to inference

- Exact inference algorithms
  - The elimination algorithm
  - Belief propagation
  - The junction tree algorithms (but will not cover in detail here)
- Approximate inference techniques
  - Variational algorithms
  - Stochastic simulation / sampling methods
  - Markov chain Monte Carlo methods
Inference on General BN via Variable Elimination

General idea:
- Write the query in the form

  $$P(X_1, \mathbf{e}) = \sum_{x_n} \cdots \sum_{x_3} \sum_{x_2} \prod_i P(x_i \mid \mathbf{pa}_i)$$

  - this suggests an "elimination order" of latent variables to be marginalized
- Iteratively
  - Move all irrelevant terms outside of the innermost sum
  - Perform the innermost sum, getting a new term
  - Insert the new term into the product
- Wrap-up (a minimal implementation sketch follows):

  $$P(X_1 \mid \mathbf{e}) = \frac{P(X_1, \mathbf{e})}{P(\mathbf{e})} = \frac{P(X_1, \mathbf{e})}{\sum_{x_1} P(x_1, \mathbf{e})}$$
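A minimal sketch of the elimination loop itself, under the assumption that all variables are binary and factors are stored as explicit tables (the helper names multiply, sum_out, and eliminate are ours, not the lecture's):

```python
from itertools import product

# Minimal variable-elimination sketch (illustrative, not the lecture's code).
# A factor is (vars, table): vars is a tuple of variable names, table maps
# each assignment tuple to a nonnegative number. All variables are binary.

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    fv, ft = f
    gv, gt = g
    vars_ = tuple(dict.fromkeys(fv + gv))            # union, order-preserving
    table = {}
    for assign in product((0, 1), repeat=len(vars_)):
        a = dict(zip(vars_, assign))
        table[assign] = (ft[tuple(a[v] for v in fv)] *
                         gt[tuple(a[v] for v in gv)])
    return vars_, table

def sum_out(f, var):
    """Marginalize one variable out of a factor."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    idx = fv.index(var)
    table = {}
    for assign, val in ft.items():
        key = assign[:idx] + assign[idx + 1:]
        table[key] = table.get(key, 0.0) + val
    return keep, table

def eliminate(factors, order):
    """Eliminate variables in `order`: gather the factors mentioning the
    variable, multiply them, sum the variable out, put the new factor back."""
    factors = list(factors)
    for var in order:
        touching = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = touching[0]
        for f in touching[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(prod, var)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result
```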
Hidden Markov Model

[Figure: HMM with hidden chain y_1 -> y_2 -> ... -> y_T and emissions x_1, ..., x_T]

Joint probability (a small evaluation sketch follows):

$$p(\mathbf{x}, \mathbf{y}) = p(x_1, \ldots, x_T, y_1, \ldots, y_T) = p(y_1)\, p(x_1 \mid y_1)\, p(y_2 \mid y_1)\, p(x_2 \mid y_2) \cdots p(y_T \mid y_{T-1})\, p(x_T \mid y_T)$$

Conditional probability:

$$p(\mathbf{y} \mid \mathbf{x}) = \frac{p(\mathbf{x}, \mathbf{y})}{\sum_{\mathbf{y}'} p(\mathbf{x}, \mathbf{y}')}$$
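A small sketch evaluating this factorization directly (the initial, transition, and emission tables pi, A, B are arbitrary toy numbers, not from the lecture):

```python
import numpy as np

# Sketch: evaluating the HMM joint p(x, y) from its factorization
# p(y1) p(x1|y1) * prod_t p(y_t | y_{t-1}) p(x_t | y_t). Toy parameters.
pi = np.array([0.5, 0.5])                 # p(y1)
A = np.array([[0.7, 0.3], [0.2, 0.8]])    # A[i, j] = p(y_{t+1}=j | y_t=i)
B = np.array([[0.9, 0.1], [0.4, 0.6]])    # B[i, k] = p(x_t=k | y_t=i)

def hmm_joint(x, y):
    """p(x_1..x_T, y_1..y_T) for observations x and state sequence y."""
    p = pi[y[0]] * B[y[0], x[0]]
    for t in range(1, len(x)):
        p *= A[y[t - 1], y[t]] * B[y[t], x[t]]
    return p

print(hmm_joint(x=[0, 1, 0], y=[0, 0, 1]))
```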
A Bayesian network: a food web

[Figure: eight-node food-web BN with edges B -> C, A -> D, A -> F, C -> E, D -> E, E -> G, E -> H, F -> H]

What is the probability that hawks are leaving given that the grass condition is poor?
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C, D, E, F, G, H
- Initial factors:

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\,P(f \mid a)\,P(g \mid e)\,P(h \mid e,f)$$

- Choose an elimination order: H, G, F, E, D, C, B
- Step 1: Conditioning (fix the evidence node, i.e., h, at its observed value $\tilde{h}$):

  $$m_h(e,f) = p(\tilde{h} \mid e,f)$$

  - This step is isomorphic to a marginalization step:

    $$m_h(e,f) = \sum_h p(h \mid e,f)\,\delta(h, \tilde{h})$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C, D, E, F, G
- Current factors:

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\,P(f \mid a)\,P(g \mid e)\, m_h(e,f)$$

- Step 2: Eliminate G; compute

  $$m_g(e) = \sum_g p(g \mid e) = 1$$

  leaving

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\,P(f \mid a)\, m_h(e,f)$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C, D, E, F
- Current factors:

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\,P(f \mid a)\, m_h(e,f)$$

- Step 3: Eliminate F; compute

  $$m_f(e,a) = \sum_f p(f \mid a)\, m_h(e,f)$$

  leaving

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\, m_f(e,a)$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C, D, E
- Current factors:

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\,P(e \mid c,d)\, m_f(e,a)$$

- Step 4: Eliminate E; compute

  $$m_e(a,c,d) = \sum_e p(e \mid c,d)\, m_f(e,a)$$

  leaving

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\, m_e(a,c,d)$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C, D
- Current factors:

  $$P(a)\,P(b)\,P(c \mid b)\,P(d \mid a)\, m_e(a,c,d)$$

- Step 5: Eliminate D; compute

  $$m_d(a,c) = \sum_d p(d \mid a)\, m_e(a,c,d)$$

  leaving

  $$P(a)\,P(b)\,P(c \mid b)\, m_d(a,c)$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B, C
- Current factors:

  $$P(a)\,P(b)\,P(c \mid b)\, m_d(a,c)$$

- Step 6: Eliminate C; compute

  $$m_c(a,b) = \sum_c p(c \mid b)\, m_d(a,c)$$

  leaving

  $$P(a)\,P(b)\, m_c(a,b)$$
Example: Variable Elimination

- Query: P(A | h)
- Need to eliminate: B
- Current factors:

  $$P(a)\,P(b)\, m_c(a,b)$$

- Step 7: Eliminate B; compute

  $$m_b(a) = \sum_b p(b)\, m_c(a,b)$$

  leaving

  $$P(a)\, m_b(a)$$
Example: Variable Elimination

- Query: P(A | h)
- Step 8: Wrap-up. All hidden variables are eliminated, leaving

  $$p(a, \tilde{h}) = p(a)\, m_b(a), \qquad p(\tilde{h}) = \sum_a p(a)\, m_b(a)$$

  so the answer to the query is

  $$P(a \mid \tilde{h}) = \frac{p(a)\, m_b(a)}{\sum_a p(a)\, m_b(a)}$$

  (a runnable replay of the whole elimination follows below)
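For illustration, the whole worked example can be replayed with the multiply/sum_out/eliminate helpers from the variable-elimination sketch earlier (assumed to be in scope). The CPTs below are random stand-ins, since the lecture gives no numeric tables, and the helper cpt is ours:

```python
import random
from itertools import product

# Reproducing the worked example with the eliminate/multiply/sum_out helpers
# defined in the earlier sketch (all CPTs are random stand-ins, purely to
# exercise the machinery). Evidence on H is absorbed by restricting
# P(h|e,f) to the observed value h~, giving the factor m_h(e,f).
random.seed(0)

def cpt(child, parents):
    """A random conditional table P(child | parents) as a factor."""
    vars_ = (child,) + parents
    table = {}
    for pa in product((0, 1), repeat=len(parents)):
        p = random.random()
        table[(0,) + pa] = p
        table[(1,) + pa] = 1.0 - p
    return vars_, table

factors = [cpt('a', ()), cpt('b', ()), cpt('c', ('b',)), cpt('d', ('a',)),
           cpt('e', ('c', 'd')), cpt('f', ('a',)), cpt('g', ('e',)),
           cpt('h', ('e', 'f'))]

# Step 1: condition on h = 1 (the observed value h~) by slicing the h-factor.
hv, ht = factors.pop()
m_h = (hv[1:], {k[1:]: v for k, v in ht.items() if k[0] == 1})
factors.append(m_h)

# Steps 2-7: eliminate G, F, E, D, C, B; step 8: normalize over A.
av, at = eliminate(factors, order=['g', 'f', 'e', 'd', 'c', 'b'])
z = sum(at.values())
print({k: v / z for k, v in at.items()})   # P(A | h~)
```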
Complexity of Variable Elimination

- Suppose in one elimination step we compute

  $$m_x(y_1, \ldots, y_k) = \sum_x m'_x(x, y_1, \ldots, y_k), \qquad m'_x(x, y_1, \ldots, y_k) = \prod_{i=1}^{k} m_i(x, \mathbf{y}_{C_i})$$

  This requires (a worked instance follows below)
  - $k \cdot |\mathrm{Val}(X)| \cdot \prod_i |\mathrm{Val}(\mathbf{Y}_{C_i})|$ multiplications
    - For each value of $x, y_1, \ldots, y_k$, we do $k$ multiplications
  - $|\mathrm{Val}(X)| \cdot \prod_i |\mathrm{Val}(\mathbf{Y}_{C_i})|$ additions
    - For each value of $y_1, \ldots, y_k$, we do $|\mathrm{Val}(X)|$ additions

- Complexity is exponential in the number of variables in the intermediate factor
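A concrete instantiation of these counts, with made-up sizes (binary X, k = 3 incoming factors whose other arguments are disjoint pairs of binary variables):

```latex
% Illustrative numbers: |Val(X)| = 2 and |Val(Y_{C_i})| = 4 for i = 1,2,3,
% so the intermediate factor m'_x has 2 * 4^3 = 128 entries.
\[
\underbrace{k\,|\mathrm{Val}(X)| \prod_i |\mathrm{Val}(\mathbf{Y}_{C_i})|}_{\text{multiplications}}
  = 3 \cdot 2 \cdot 4^3 = 384,
\qquad
\underbrace{|\mathrm{Val}(X)| \prod_i |\mathrm{Val}(\mathbf{Y}_{C_i})|}_{\text{additions}}
  = 2 \cdot 4^3 = 128 .
\]
```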
Elimination Cliques

[Figure: the sequence of graphs produced by eliminating H, G, F, E, D, C, B from the food-web network, with the elimination clique and message at each step: m_h(e,f), m_g(e), m_f(e,a), m_e(a,c,d), m_d(a,c), m_c(a,b), m_b(a)]
Understanding Variable Elimination

- A graph elimination algorithm: moralization followed by graph elimination (a small sketch follows below)
  [Figure: the BN is moralized, then H, G, F, E, D, C, B are eliminated one by one]
- Intermediate terms correspond to the cliques that result from elimination
  - "good" elimination orderings lead to small cliques and hence reduce complexity (what will happen if we eliminate "e" first in the above graph?)
  - finding the optimum ordering is NP-hard, but for many graphs an optimum or near-optimum ordering can often be found heuristically
- Applies to undirected GMs
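A short sketch of the graph-elimination view (pure Python over adjacency sets; the moralized food-web graph below is reconstructed from the factorization used earlier, with the moralizing edges C-D and E-F). It reports the elimination clique created at each step, matching the intermediate factors m_h(e,f), ..., m_b(a) above:

```python
# Sketch (illustrative): graph elimination on the moralized food-web graph,
# recording the elimination clique created at each step.
def eliminate_graph(neighbors, order):
    """neighbors: dict node -> set of neighbors (undirected, moralized).
    Returns the elimination clique formed as each node is removed."""
    nbrs = {v: set(ns) for v, ns in neighbors.items()}
    cliques = []
    for v in order:
        clique = {v} | nbrs[v]
        cliques.append(clique)
        # connect v's remaining neighbors pairwise (fill-in edges), drop v
        for u in nbrs[v]:
            nbrs[u] |= nbrs[v] - {u, v}
            nbrs[u].discard(v)
        del nbrs[v]
    return cliques

# Moralized food web: BN edges B->C, A->D, A->F, C->E, D->E, E->G, E->H,
# F->H; moralization marries C-D (parents of E) and E-F (parents of H).
moral = {'a': {'d', 'f'}, 'b': {'c'}, 'c': {'b', 'd', 'e'},
         'd': {'a', 'c', 'e'}, 'e': {'c', 'd', 'f', 'g', 'h'},
         'f': {'a', 'e', 'h'}, 'g': {'e'}, 'h': {'e', 'f'}}

for cl in eliminate_graph(moral, order='hgfedcb'):
    print(sorted(cl))   # {e,f,h}, {e,g}, {a,e,f}, {a,c,d,e}, {a,c,d}, ...
```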
From Elimination to Belief Propagation

- Recall that induced dependency during marginalization is captured in elimination cliques
  - Summation <-> elimination
  - Intermediate term <-> elimination clique
  [Figure: the elimination cliques from the food-web example]
- Can this lead to a generic inference algorithm?
Tree GMs

[Figure: an undirected tree, a directed tree, and a polytree]

- Undirected tree: a unique path between any pair of nodes
- Directed tree: all nodes except the root have exactly one parent
- Polytree: nodes can have multiple parents
Equivalence of Directed and Undirected Trees

- Any undirected tree can be converted to a directed tree by choosing a root node and directing all edges away from it
- A directed tree and the corresponding undirected tree make the same conditional independence assertions
- Parameterizations are essentially the same:
  - Undirected tree: $p(\mathbf{x}) = \frac{1}{Z} \prod_{i \in V} \psi(x_i) \prod_{(i,j) \in E} \psi(x_i, x_j)$
  - Directed tree (root r): $p(\mathbf{x}) = p(x_r) \prod_{(i,j) \in E} p(x_j \mid x_i)$
  - Equivalence: $\psi(x_r) = p(x_r)$, $\psi(x_i, x_j) = p(x_j \mid x_i)$, all other potentials equal to 1, so $Z = 1$
- Evidence: ?
From elimination to message passing

- Recall the ELIMINATION algorithm:
  - Choose an ordering Z in which the query node f is the final node
  - Place all potentials on an active list
  - Eliminate node i by removing all potentials containing i, taking the sum/product over x_i
  - Place the resultant factor back on the list
- For a TREE graph:
  - Choose the query node f as the root of the tree
  - View the tree as a directed tree with edges pointing toward f
  - Elimination ordering based on depth-first traversal
  - Elimination of each node can be considered as message passing (or belief propagation) directly along tree branches, rather than on some transformed graphs
  - Thus, we can use the tree itself as a data structure to do general inference!
Message passing for trees

- Let $m_{ji}(x_i)$ denote the factor resulting from eliminating variables from below up to node i; it is a function of $x_i$:

  $$m_{ji}(x_i) = \sum_{x_j} \psi(x_j)\, \psi(x_i, x_j) \prod_{k \in N(j) \setminus i} m_{kj}(x_j)$$

  - This is reminiscent of a message sent from j to i.
  - $m_{ji}(x_i)$ represents a "belief" about $x_i$ from $x_j$!
- Elimination on trees is equivalent to message passing along tree branches!
The message passing protocol:

- A two-pass algorithm:

[Figure: four-node tree rooted at X_1, with child X_2 and grandchildren X_3, X_4; the collect pass sends m_32(x_2), m_42(x_2), m_21(x_1), and the distribute pass sends m_12(x_2), m_23(x_3), m_24(x_4)]
Belief Propagation (SP-algorithm): Sequential implementation

[The sequential two-pass schedule was shown as a figure; a minimal sketch follows below.]

Belief Propagation (SP-algorithm): Parallel synchronous implementation

- For a node of degree d, whenever messages have arrived on any subset of d-1 edges, compute the message for the remaining edge and send!
- A pair of messages have been computed for each edge, one for each direction
- All incoming messages are eventually computed for each node
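The sequential schedule itself was only shown as a figure; here is a minimal two-pass sum-product sketch on the four-node tree from the protocol slide, with illustrative potentials (uniform except a bias on X_1; all names and numbers are ours):

```python
import numpy as np

# Minimal two-pass sum-product sketch on the four-node tree above
# (X2 is the hub; all variables binary; potentials are illustrative).
# psi[(i, j)] is the pairwise potential with rows indexed by x_i, cols by x_j.
edges = [(1, 2), (2, 3), (2, 4)]
nbrs = {1: [2], 2: [1, 3, 4], 3: [2], 4: [2]}
psi_node = {i: np.ones(2) for i in nbrs}                 # singleton potentials
psi_node[1] = np.array([0.7, 0.3])                       # bias one node
psi = {e: np.array([[1.0, 0.5], [0.5, 1.0]]) for e in edges}
psi.update({(j, i): psi[(i, j)].T for (i, j) in edges})  # both directions

def message(j, i, msgs):
    """m_{j->i}(x_i) = sum_{x_j} psi_j(x_j) psi(x_i,x_j) prod_{k in N(j), k != i} m_{k->j}(x_j)."""
    incoming = psi_node[j].copy()
    for k in nbrs[j]:
        if k != i:
            incoming *= msgs[(k, j)]
    return psi[(i, j)] @ incoming

msgs = {}
# Collect pass (leaves toward the root X1), then distribute pass (root outward).
for (j, i) in [(3, 2), (4, 2), (2, 1), (1, 2), (2, 3), (2, 4)]:
    msgs[(j, i)] = message(j, i, msgs)

# Node marginals: belief(x_i) is proportional to psi_i(x_i) * prod_k m_{k->i}(x_i).
for i in nbrs:
    b = psi_node[i].copy()
    for k in nbrs[i]:
        b *= msgs[(k, i)]
    print(i, b / b.sum())
```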
Correctness of BP on trees

- Corollary: the synchronous implementation is "non-blocking"
- Thm: Message passing guarantees obtaining all marginals in the tree
- What about non-trees?
Inference on general GMs

- Now, what if the GM is not a tree-like graph?
  - Can we still directly run the message-passing protocol along its edges?
  - For non-trees, we do not have the guarantee that message passing will be consistent!
- Then what?
  - Construct a graph data structure from P that has a tree structure, and run message passing on it!
  - The junction tree algorithm
Elimination Cliques (recap)

- Recall that induced dependency during marginalization is captured in elimination cliques
  - Summation <-> elimination; intermediate term <-> elimination clique
  [Figure: the elimination cliques from the food-web example]
- Can this lead to a generic inference algorithm?
A Clique Tree

[Figure: the elimination cliques of the food-web example arranged as a clique tree, each annotated with the message it sends: m_b, m_c, m_d, m_e, m_f, m_g, m_h]

$$m_e(a,c,d) = \sum_e p(e \mid c,d)\, m_g(e)\, m_f(e,a)$$
From Elimination to Message Passing

- Elimination <-> message passing on a clique tree
  [Figure: the graph-elimination steps shown side by side with the corresponding clique-tree messages]

  $$m_e(a,c,d) = \sum_e p(e \mid c,d)\, m_g(e)\, m_f(e,a)$$

- Messages can be reused
From Elimination to Message Passing

- Elimination <-> message passing on a clique tree
- Another query ...
  [Figure: the same clique tree with the messages redirected toward the new query]
- Messages m_f and m_h are reused; the others need to be recomputed
The Shafer-Shenoy Algorithm

- Shafer-Shenoy algorithm
  - Message from clique i to clique j:

    $$\delta_{i \to j}(S_{ij}) = \sum_{C_i \setminus S_{ij}} \psi_{C_i} \prod_{k \neq j} \delta_{k \to i}(S_{ki})$$

  - Clique marginal:

    $$p(C_i) \propto \psi_{C_i} \prod_{k} \delta_{k \to i}(S_{ki})$$
A Sketch of the Junction Tree Algorithm

- The algorithm
  - Construction of junction trees --- a special clique tree
  - Propagation of probabilities --- a message-passing protocol
- Results in marginal probabilities of all cliques --- solves all queries in a single run
- A generic exact inference algorithm for any GM
- Complexity: exponential in the size of the maximal clique --- a good elimination order often leads to a small maximal clique, and hence a good (i.e., thin) JT
- Many well-known algorithms are special cases of JT
  - Forward-backward, Kalman filter, peeling, sum-product ...
The Junction Tree Algorithm for HMMs

- A junction tree for the HMM
  [Figure: clique chain (y_1,x_1) - (y_1,y_2) - (y_2,x_2) - (y_2,y_3) - ... - (y_{T-1},y_T) - (y_T,x_T), with separator potentials φ(y_t)]
- Rightward pass, with clique potentials $\psi(y_t, y_{t+1}) = p(y_{t+1} \mid y_t)$ and the evidence potentials $p(x_t \mid y_t)$ absorbed from the cliques $(y_t, x_t)$:

  $$\mu_{t \to t+1}(y_{t+1}) = \sum_{y_t} \psi(y_t, y_{t+1})\, p(x_t \mid y_t)\, \mu_{t-1 \to t}(y_t) = \sum_{y_t} p(y_{t+1} \mid y_t)\, p(x_t \mid y_t)\, \mu_{t-1 \to t}(y_t)$$

  - This is exactly the forward algorithm!
- Leftward pass:

  $$\mu_{t \leftarrow t+1}(y_t) = \sum_{y_{t+1}} \psi(y_t, y_{t+1})\, p(x_{t+1} \mid y_{t+1})\, \mu_{t+1 \leftarrow t+2}(y_{t+1})$$

  - This is exactly the backward algorithm! (a compact sketch of both passes follows)
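A compact sketch of the two passes as the classical forward-backward recursions (toy parameters; alpha/beta are the standard forward/backward quantities, not notation taken from the slide):

```python
import numpy as np

# Sketch: the two junction-tree passes for the HMM, i.e. the classical
# forward-backward algorithm, with toy parameters.
pi = np.array([0.5, 0.5])                 # p(y1)
A = np.array([[0.7, 0.3], [0.2, 0.8]])    # p(y_{t+1} | y_t)
B = np.array([[0.9, 0.1], [0.4, 0.6]])    # p(x_t | y_t)
x = [0, 1, 1, 0]                          # observations
T, K = len(x), len(pi)

alpha = np.zeros((T, K))                  # rightward: alpha_t(y) = p(x_1..t, y_t=y)
alpha[0] = pi * B[:, x[0]]
for t in range(1, T):
    alpha[t] = B[:, x[t]] * (alpha[t - 1] @ A)

beta = np.ones((T, K))                    # leftward: beta_t(y) = p(x_{t+1}..T | y_t=y)
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, x[t + 1]] * beta[t + 1])

gamma = alpha * beta
gamma /= gamma.sum(axis=1, keepdims=True) # posterior marginals p(y_t | x)
print(gamma)
```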
Summary

- The simple Eliminate algorithm captures the key algorithmic operation underlying probabilistic inference: that of taking a sum over a product of potential functions
- The computational complexity of the Eliminate algorithm can be reduced to purely graph-theoretic considerations
- This graph interpretation will also provide hints about how to design improved inference algorithms
- What can we say about the overall computational complexity of the algorithm? In particular, how can we control the "size" of the summands that appear in the sequence of summation operations?
Contenu connexe

Tendances

My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...Umberto Picchini
 
Predictive Modeling in Insurance in the context of (possibly) big data
Predictive Modeling in Insurance in the context of (possibly) big dataPredictive Modeling in Insurance in the context of (possibly) big data
Predictive Modeling in Insurance in the context of (possibly) big dataArthur Charpentier
 
Machine Learning in Actuarial Science & Insurance
Machine Learning in Actuarial Science & InsuranceMachine Learning in Actuarial Science & Insurance
Machine Learning in Actuarial Science & InsuranceArthur Charpentier
 
Accelerating Metropolis Hastings with Lightweight Inference Compilation
Accelerating Metropolis Hastings with Lightweight Inference CompilationAccelerating Metropolis Hastings with Lightweight Inference Compilation
Accelerating Metropolis Hastings with Lightweight Inference CompilationFeynman Liang
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsChristian Robert
 
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...Umberto Picchini
 
A Tutorial of the EM-algorithm and Its Application to Outlier Detection
A Tutorial of the EM-algorithm and Its Application to Outlier DetectionA Tutorial of the EM-algorithm and Its Application to Outlier Detection
A Tutorial of the EM-algorithm and Its Application to Outlier DetectionKonkuk University, Korea
 
When Classifier Selection meets Information Theory: A Unifying View
When Classifier Selection meets Information Theory: A Unifying ViewWhen Classifier Selection meets Information Theory: A Unifying View
When Classifier Selection meets Information Theory: A Unifying ViewMohamed Farouk
 
better together? statistical learning in models made of modules
better together? statistical learning in models made of modulesbetter together? statistical learning in models made of modules
better together? statistical learning in models made of modulesChristian Robert
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABCChristian Robert
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methodsChristian Robert
 
Hands-On Algorithms for Predictive Modeling
Hands-On Algorithms for Predictive ModelingHands-On Algorithms for Predictive Modeling
Hands-On Algorithms for Predictive ModelingArthur Charpentier
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationChristian Robert
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testingChristian Robert
 

Tendances (20)

Boston talk
Boston talkBoston talk
Boston talk
 
My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...My data are incomplete and noisy: Information-reduction statistical methods f...
My data are incomplete and noisy: Information-reduction statistical methods f...
 
Predictive Modeling in Insurance in the context of (possibly) big data
Predictive Modeling in Insurance in the context of (possibly) big dataPredictive Modeling in Insurance in the context of (possibly) big data
Predictive Modeling in Insurance in the context of (possibly) big data
 
Intractable likelihoods
Intractable likelihoodsIntractable likelihoods
Intractable likelihoods
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
 
Machine Learning in Actuarial Science & Insurance
Machine Learning in Actuarial Science & InsuranceMachine Learning in Actuarial Science & Insurance
Machine Learning in Actuarial Science & Insurance
 
Side 2019 #9
Side 2019 #9Side 2019 #9
Side 2019 #9
 
Accelerating Metropolis Hastings with Lightweight Inference Compilation
Accelerating Metropolis Hastings with Lightweight Inference CompilationAccelerating Metropolis Hastings with Lightweight Inference Compilation
Accelerating Metropolis Hastings with Lightweight Inference Compilation
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forests
 
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...
A likelihood-free version of the stochastic approximation EM algorithm (SAEM)...
 
A Tutorial of the EM-algorithm and Its Application to Outlier Detection
A Tutorial of the EM-algorithm and Its Application to Outlier DetectionA Tutorial of the EM-algorithm and Its Application to Outlier Detection
A Tutorial of the EM-algorithm and Its Application to Outlier Detection
 
When Classifier Selection meets Information Theory: A Unifying View
When Classifier Selection meets Information Theory: A Unifying ViewWhen Classifier Selection meets Information Theory: A Unifying View
When Classifier Selection meets Information Theory: A Unifying View
 
better together? statistical learning in models made of modules
better together? statistical learning in models made of modulesbetter together? statistical learning in models made of modules
better together? statistical learning in models made of modules
 
from model uncertainty to ABC
from model uncertainty to ABCfrom model uncertainty to ABC
from model uncertainty to ABC
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
 
02 math essentials
02 math essentials02 math essentials
02 math essentials
 
Hands-On Algorithms for Predictive Modeling
Hands-On Algorithms for Predictive ModelingHands-On Algorithms for Predictive Modeling
Hands-On Algorithms for Predictive Modeling
 
ABC workshop: 17w5025
ABC workshop: 17w5025ABC workshop: 17w5025
ABC workshop: 17w5025
 
random forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimationrandom forests for ABC model choice and parameter estimation
random forests for ABC model choice and parameter estimation
 
An overview of Bayesian testing
An overview of Bayesian testingAn overview of Bayesian testing
An overview of Bayesian testing
 

En vedette

Linkedin - Facts about Linkedin and why you need to be on it
Linkedin  - Facts about Linkedin and why you need to be on itLinkedin  - Facts about Linkedin and why you need to be on it
Linkedin - Facts about Linkedin and why you need to be on itThe Pathway Group
 
Social Recruiting & Professional Networking (Viadeo)
Social Recruiting & Professional Networking (Viadeo)Social Recruiting & Professional Networking (Viadeo)
Social Recruiting & Professional Networking (Viadeo)Crexia
 
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010Yahoo Developer Network
 
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !Viadeo Student Challenge : présentation de la 2ème meilleure équipe !
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !Viadeo
 
Introducing the XING Partner Ecosystem
Introducing the XING Partner EcosystemIntroducing the XING Partner Ecosystem
Introducing the XING Partner EcosystemXING AG
 
Mobile LBS Summit 2010, Wiesbaden
Mobile LBS Summit 2010, WiesbadenMobile LBS Summit 2010, Wiesbaden
Mobile LBS Summit 2010, WiesbadenJackson Bond
 
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmc
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmcDon’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmc
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmcAllFacebook.de
 

En vedette (10)

Linkedin - Facts about Linkedin and why you need to be on it
Linkedin  - Facts about Linkedin and why you need to be on itLinkedin  - Facts about Linkedin and why you need to be on it
Linkedin - Facts about Linkedin and why you need to be on it
 
Presentation 20100428
Presentation 20100428Presentation 20100428
Presentation 20100428
 
Social Recruiting & Professional Networking (Viadeo)
Social Recruiting & Professional Networking (Viadeo)Social Recruiting & Professional Networking (Viadeo)
Social Recruiting & Professional Networking (Viadeo)
 
LinkedIn vs Xing
LinkedIn vs XingLinkedIn vs Xing
LinkedIn vs Xing
 
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010
Exact Inference in Bayesian Networks using MapReduce__HadoopSummit2010
 
Xing
XingXing
Xing
 
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !Viadeo Student Challenge : présentation de la 2ème meilleure équipe !
Viadeo Student Challenge : présentation de la 2ème meilleure équipe !
 
Introducing the XING Partner Ecosystem
Introducing the XING Partner EcosystemIntroducing the XING Partner Ecosystem
Introducing the XING Partner Ecosystem
 
Mobile LBS Summit 2010, Wiesbaden
Mobile LBS Summit 2010, WiesbadenMobile LBS Summit 2010, Wiesbaden
Mobile LBS Summit 2010, Wiesbaden
 
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmc
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmcDon’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmc
Don’t forget to XING – das Businessnetzwerk im Social-Media–Mix #afbmc
 

Similaire à Lecture11 xing

3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMINGChristian Robert
 
Probability cheatsheet
Probability cheatsheetProbability cheatsheet
Probability cheatsheetJoachim Gwoke
 
Likelihood free computational statistics
Likelihood free computational statisticsLikelihood free computational statistics
Likelihood free computational statisticsPierre Pudlo
 
Lecture13 xing fei-fei
Lecture13 xing fei-feiLecture13 xing fei-fei
Lecture13 xing fei-feiTianlu Wang
 
Probability cheatsheet
Probability cheatsheetProbability cheatsheet
Probability cheatsheetSuvrat Mishra
 
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...NTNU
 
Use of the correlation coefficient as a measure of effectiveness of a scoring...
Use of the correlation coefficient as a measure of effectiveness of a scoring...Use of the correlation coefficient as a measure of effectiveness of a scoring...
Use of the correlation coefficient as a measure of effectiveness of a scoring...Wajih Alaiyan
 
從 VAE 走向深度學習新理論
從 VAE 走向深度學習新理論從 VAE 走向深度學習新理論
從 VAE 走向深度學習新理論岳華 杜
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Christian Robert
 
Elementary Probability and Information Theory
Elementary Probability and Information TheoryElementary Probability and Information Theory
Elementary Probability and Information TheoryKhalidSaghiri2
 
Fin500J_topic10_Probability_2010_0000000
Fin500J_topic10_Probability_2010_0000000Fin500J_topic10_Probability_2010_0000000
Fin500J_topic10_Probability_2010_0000000Tushar Chaudhari
 

Similaire à Lecture11 xing (20)

Lecture12 xing
Lecture12 xingLecture12 xing
Lecture12 xing
 
Bayesian_Decision_Theory-3.pdf
Bayesian_Decision_Theory-3.pdfBayesian_Decision_Theory-3.pdf
Bayesian_Decision_Theory-3.pdf
 
Lecture15 xing
Lecture15 xingLecture15 xing
Lecture15 xing
 
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
3rd NIPS Workshop on PROBABILISTIC PROGRAMMING
 
Ab cancun
Ab cancunAb cancun
Ab cancun
 
Probability cheatsheet
Probability cheatsheetProbability cheatsheet
Probability cheatsheet
 
Likelihood free computational statistics
Likelihood free computational statisticsLikelihood free computational statistics
Likelihood free computational statistics
 
Lecture13 xing fei-fei
Lecture13 xing fei-feiLecture13 xing fei-fei
Lecture13 xing fei-fei
 
Probability Cheatsheet.pdf
Probability Cheatsheet.pdfProbability Cheatsheet.pdf
Probability Cheatsheet.pdf
 
Probability cheatsheet
Probability cheatsheetProbability cheatsheet
Probability cheatsheet
 
Lecture2 xing
Lecture2 xingLecture2 xing
Lecture2 xing
 
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...
An Importance Sampling Approach to Integrate Expert Knowledge When Learning B...
 
Use of the correlation coefficient as a measure of effectiveness of a scoring...
Use of the correlation coefficient as a measure of effectiveness of a scoring...Use of the correlation coefficient as a measure of effectiveness of a scoring...
Use of the correlation coefficient as a measure of effectiveness of a scoring...
 
從 VAE 走向深度學習新理論
從 VAE 走向深度學習新理論從 VAE 走向深度學習新理論
從 VAE 走向深度學習新理論
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
 
Econometrics 2017-graduate-3
Econometrics 2017-graduate-3Econometrics 2017-graduate-3
Econometrics 2017-graduate-3
 
ML unit-1.pptx
ML unit-1.pptxML unit-1.pptx
ML unit-1.pptx
 
Elementary Probability and Information Theory
Elementary Probability and Information TheoryElementary Probability and Information Theory
Elementary Probability and Information Theory
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Fin500J_topic10_Probability_2010_0000000
Fin500J_topic10_Probability_2010_0000000Fin500J_topic10_Probability_2010_0000000
Fin500J_topic10_Probability_2010_0000000
 

Plus de Tianlu Wang

14 pro resolution
14 pro resolution14 pro resolution
14 pro resolutionTianlu Wang
 
13 propositional calculus
13 propositional calculus13 propositional calculus
13 propositional calculusTianlu Wang
 
12 adversal search
12 adversal search12 adversal search
12 adversal searchTianlu Wang
 
11 alternative search
11 alternative search11 alternative search
11 alternative searchTianlu Wang
 
21 situation calculus
21 situation calculus21 situation calculus
21 situation calculusTianlu Wang
 
20 bayes learning
20 bayes learning20 bayes learning
20 bayes learningTianlu Wang
 
19 uncertain evidence
19 uncertain evidence19 uncertain evidence
19 uncertain evidenceTianlu Wang
 
18 common knowledge
18 common knowledge18 common knowledge
18 common knowledgeTianlu Wang
 
17 2 expert systems
17 2 expert systems17 2 expert systems
17 2 expert systemsTianlu Wang
 
17 1 knowledge-based system
17 1 knowledge-based system17 1 knowledge-based system
17 1 knowledge-based systemTianlu Wang
 
16 2 predicate resolution
16 2 predicate resolution16 2 predicate resolution
16 2 predicate resolutionTianlu Wang
 
16 1 predicate resolution
16 1 predicate resolution16 1 predicate resolution
16 1 predicate resolutionTianlu Wang
 
09 heuristic search
09 heuristic search09 heuristic search
09 heuristic searchTianlu Wang
 
08 uninformed search
08 uninformed search08 uninformed search
08 uninformed searchTianlu Wang
 

Plus de Tianlu Wang (20)

L7 er2
L7 er2L7 er2
L7 er2
 
L8 design1
L8 design1L8 design1
L8 design1
 
L9 design2
L9 design2L9 design2
L9 design2
 
14 pro resolution
14 pro resolution14 pro resolution
14 pro resolution
 
13 propositional calculus
13 propositional calculus13 propositional calculus
13 propositional calculus
 
12 adversal search
12 adversal search12 adversal search
12 adversal search
 
11 alternative search
11 alternative search11 alternative search
11 alternative search
 
10 2 sum
10 2 sum10 2 sum
10 2 sum
 
22 planning
22 planning22 planning
22 planning
 
21 situation calculus
21 situation calculus21 situation calculus
21 situation calculus
 
20 bayes learning
20 bayes learning20 bayes learning
20 bayes learning
 
19 uncertain evidence
19 uncertain evidence19 uncertain evidence
19 uncertain evidence
 
18 common knowledge
18 common knowledge18 common knowledge
18 common knowledge
 
17 2 expert systems
17 2 expert systems17 2 expert systems
17 2 expert systems
 
17 1 knowledge-based system
17 1 knowledge-based system17 1 knowledge-based system
17 1 knowledge-based system
 
16 2 predicate resolution
16 2 predicate resolution16 2 predicate resolution
16 2 predicate resolution
 
16 1 predicate resolution
16 1 predicate resolution16 1 predicate resolution
16 1 predicate resolution
 
15 predicate
15 predicate15 predicate
15 predicate
 
09 heuristic search
09 heuristic search09 heuristic search
09 heuristic search
 
08 uninformed search
08 uninformed search08 uninformed search
08 uninformed search
 

Dernier

Theoretical Framework- Explanation with Flow Chart.docx
Theoretical Framework- Explanation with Flow Chart.docxTheoretical Framework- Explanation with Flow Chart.docx
Theoretical Framework- Explanation with Flow Chart.docxAman119787
 
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhi
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) DelhiWhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhi
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhidelhimunirka15
 
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...Nitya salvi
 
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155SaketCallGirlsCallUs
 
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 Indore
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 IndoreRussian Call Girls In Bhubaneswar 📱 Odisha 9777949614 Indore
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 IndoreCall Girls Mumbai
 
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)River / Thao Phan
 
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)River / Thao Phan
 
Sui Generis Magazine volume one Kristen Murillo.pdf
Sui Generis Magazine volume one Kristen Murillo.pdfSui Generis Magazine volume one Kristen Murillo.pdf
Sui Generis Magazine volume one Kristen Murillo.pdfkristenmurillo218
 
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155Call Girls In Chattarpur | Contact Me ☎ +91-9953040155
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155SaketCallGirlsCallUs
 
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service AvailableCall Girls Varanasi Just Call 8617370543Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service AvailableNitya salvi
 
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service Available
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service AvailableCall Girls Aligarh Just Call 8617370543 Top Class Call Girl Service Available
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service AvailableNitya salvi
 
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...Sheetaleventcompany
 
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhidelhimunirka15
 
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...Nitya salvi
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607dollysharma2066
 
codes and conventions of film magazine and website.pptx
codes and conventions of film magazine and website.pptxcodes and conventions of film magazine and website.pptx
codes and conventions of film magazine and website.pptx17duffyc
 
Jaro je tady - Spring is here (Judith) 3
Jaro je tady - Spring is here (Judith) 3Jaro je tady - Spring is here (Judith) 3
Jaro je tady - Spring is here (Judith) 3wistariecz
 
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...Nitya salvi
 
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证khuurq8kz
 
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptx
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptxEngineering Major for College_ Environmental Health Engineering by Slidesgo.pptx
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptxDanielRemache4
 

Dernier (20)

Theoretical Framework- Explanation with Flow Chart.docx
Theoretical Framework- Explanation with Flow Chart.docxTheoretical Framework- Explanation with Flow Chart.docx
Theoretical Framework- Explanation with Flow Chart.docx
 
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhi
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) DelhiWhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhi
WhatsApp-(# 9711106444 #)Call Girl in Noida Sector 80 Noida (Escorts) Delhi
 
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...
Just Call Vip call girls Farrukhabad Escorts ☎️8617370543 Two shot with one g...
 
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155
Call Girls In Dilshad Garden | Contact Me ☎ +91-9953040155
 
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 Indore
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 IndoreRussian Call Girls In Bhubaneswar 📱 Odisha 9777949614 Indore
Russian Call Girls In Bhubaneswar 📱 Odisha 9777949614 Indore
 
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)
SB_ Scott Pilgrim_ Rough_ RiverPhan (2024)
 
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)
SB_ Pretzel and the puppies_ Rough_ RiverPhan (2024)
 
Sui Generis Magazine volume one Kristen Murillo.pdf
Sui Generis Magazine volume one Kristen Murillo.pdfSui Generis Magazine volume one Kristen Murillo.pdf
Sui Generis Magazine volume one Kristen Murillo.pdf
 
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155Call Girls In Chattarpur | Contact Me ☎ +91-9953040155
Call Girls In Chattarpur | Contact Me ☎ +91-9953040155
 
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service AvailableCall Girls Varanasi Just Call 8617370543Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8617370543Top Class Call Girl Service Available
 
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service Available
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service AvailableCall Girls Aligarh Just Call 8617370543 Top Class Call Girl Service Available
Call Girls Aligarh Just Call 8617370543 Top Class Call Girl Service Available
 
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...
Call Girl In Chandigarh ☎ 08868886958✅ Just Genuine Call Call Girls Chandigar...
 
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi
9711106444 Ghaziabad, Call Girls @ ₹. 1500– Per Shot Per Night 7000 Delhi
 
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...
WhatsApp Chat: 📞 8617370543 Call Girls In Siddharth Nagar At Low Cost Cash Pa...
 
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607
FULL ENJOY Call Girls In Mahipalpur Delhi Contact Us 8377087607
 
codes and conventions of film magazine and website.pptx
codes and conventions of film magazine and website.pptxcodes and conventions of film magazine and website.pptx
codes and conventions of film magazine and website.pptx
 
Jaro je tady - Spring is here (Judith) 3
Jaro je tady - Spring is here (Judith) 3Jaro je tady - Spring is here (Judith) 3
Jaro je tady - Spring is here (Judith) 3
 
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...
Call Girls Sultanpur Just Call 📞 8617370543 Top Class Call Girl Service Avail...
 
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证
一比一原版美国西雅图大学毕业证(Seattle毕业证书)毕业证成绩单留信认证
 
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptx
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptxEngineering Major for College_ Environmental Health Engineering by Slidesgo.pptx
Engineering Major for College_ Environmental Health Engineering by Slidesgo.pptx
 

Lecture11 xing

  • 1. Machine LearningMachine Learninggg Exact InferenceExact Inference Eric XingEric Xing Lecture 11, August 14, 2010 AAA B A C B A C B A C A DC A DC B AB A AA fm bmcm dm Eric Xing © Eric Xing @ CMU, 2006-2010 1 Reading: E F H E F H E F H E FE FE F E G E G E G A DC E A DC E A DC E DC DC hm gm em fm
  • 2. Inference and LearningInference and Learning  We now have compact representations of probability distributions: GM  A BN M describes a unique probability distribution P  Typical tasks:  Task 1: How do we answer queries about P?  We use inference as a name for the process of computing answers to such queries  Task 2: How do we estimate a plausible model M from data D? Task 2: How do we estimate a plausible model M from data D? i. We use learning as a name for the process of obtaining point estimate of M. ii. But for Bayesian, they seek p(M |D), which is actually an inference problem. Eric Xing © Eric Xing @ CMU, 2006-2010 2 iii. When not all variables are observable, even computing point estimate of M need to do inference to impute the missing data.
  • 3. Inferential Query 1: LikelihoodLikelihood  Most of the queries one may ask involve evidenceq y  Evidence xv is an assignment of values to a set Xv of nodes in the GM over varialbe set X={X1, X2, …, Xn}  Without loss of generality Xv={Xk+1, … , Xn},  Write XH=XXv as the set of hidden variables, XH can be or X  Simplest query: compute probability of evidence ∑ ∑ )(),,()( 1 1 x x k k ,,x,xPPP v x vHv xXXx H   Eric Xing © Eric Xing @ CMU, 2006-2010 3  this is often referred to as computing the likelihood of xv
  • 4. Inferential Query 2: Conditional Probability  Often we are interested in the conditional probability Conditional Probability p y distribution of a variable given the evidence VHVH xXxX xXX ),(),( )|( PP P  this is the a posteriori belief in XH, given evidence xv    Hx VHH VH V VH VVH xxXx xXX ),()( )|( PP P this is the a posteriori belief in XH, given evidence xv  We usually query a subset Y of all hidden variables XH={Y,Z} and "don't care" about the remaining Z:and don t care about the remaining, Z:   z VV xzZYxY )|,()|( PP Eric Xing © Eric Xing @ CMU, 2006-2010 4  the process of summing out the "don't care" variables z is called marginalization, and the resulting P(Y|xv) is called a marginal prob.
  • 5. Applications of a posteriori Belief  Prediction: what is the probability of an outcome given the starting ? Applications of a posteriori Belief condition  the query node is a descendent of the evidence A CB ? q y  Diagnosis: what is the probability of disease/fault given symptoms A CB ?  the query node an ancestor of the evidence  Learning under partial observation A CB g p  fill in the unobserved values under an "EM" setting (more later)  The directionality of information flow between variables is not Eric Xing © Eric Xing @ CMU, 2006-2010 5 y restricted by the directionality of the edges in a GM  probabilistic inference can combine evidence form all parts of the network
  • 6. Inferential Query 3: Most Probable Assignment  In this query we want to find the most probable joint Most Probable Assignment assignment (MPA) for some variables of interest  Such reasoning is usually performed under some given evidence xv, and ignoring (the values of) other variables Z:   z VyVyV xzZYxYxY )|,(maxarg)|(maxarg|* PP  this is the maximum a posteriori configuration of Y. Eric Xing © Eric Xing @ CMU, 2006-2010 6
  • 7. Complexity of Inference Thm: Complexity of Inference Thm: Computing P(XH=xH| xv) in an arbitrary GM is NP-hard  Hardness does not mean we cannot solve inference  It implies that we cannot find a general procedure that works efficiently for arbitrary GMsy  For particular families of GMs, we can have provably efficient procedures Eric Xing © Eric Xing @ CMU, 2006-2010 7
  • 8. Approaches to inferenceApproaches to inference  Exact inference algorithmsg  The elimination algorithm  Belief propagationp p g  The junction tree algorithms (but will not cover in detail here)  Approximate inference techniques Approximate inference techniques  Variational algorithmsVariational algorithms  Stochastic simulation / sampling methods  Markov chain Monte Carlo methods Eric Xing © Eric Xing @ CMU, 2006-2010 8
  • 9. Inference on General BN via Variable Elimination General idea: Variable Elimination  Write query in the form   ii paxPXP 1 )|(),( e  this suggests an "elimination order" of latent variables to be marginalized  Iteratively   nx x x i ii paxPXP 3 2 1 )|(),( e  Iteratively  Move all irrelevant terms outside of innermost sum  Perform innermost sum, getting a new term I t th t i t th d t Insert the new term into the product  wrap-up ),( )|( eXP XP 1 Eric Xing © Eric Xing @ CMU, 2006-2010 9 )( ),( )|( e e P XP 1 1 
  • 10. Hidden Markov ModelHidden Markov Model y2 y3y1 yT... p(x y) = p(x x y y ) A AA Ax2 x3x1 xT y2 y3y1 yT... ... p(x, y) = p(x1……xT, y1, ……, yT) = p(y1) p(x1 | y1) p(y2 | y1) p(x2 | y2) … p(yT | yT-1) p(xT | yT) C diti l b bilitConditional probability: Eric Xing © Eric Xing @ CMU, 2006-2010 10
  • 11. Hidden Markov ModelHidden Markov Model y2 y3y1 yT... Conditional probability: A AA Ax2 x3x1 xT y2 y3y1 yT... ... Eric Xing © Eric Xing @ CMU, 2006-2010 11
  • 12. A Bayesian network A food web A Bayesian network A food web B A DC E FE F G H Eric Xing © Eric Xing @ CMU, 2006-2010 12 What is the probability that hawks are leaving given that the grass condition is poor?
  • 13. Example: Variable Elimination  Query: P(A |h) B A Example: Variable Elimination  Need to eliminate: B,C,D,E,F,G,H  Initial factors: B A DC  Choose an elimination order: H,G,F,E,D,C,B E F G H ),|()|()|(),|()|()|()()( fehPegPafPdcePadPbcPbPaP  Step 1:  Conditioning (fix the evidence node (i.e., h) on its observed value (i.e., )):h ~  This step is isomorphic to a marginalization step: ),| ~ (),( fehhpfemh  B A DC Eric Xing © Eric Xing @ CMU, 2006-2010 13   h h hhfehpfem ) ~ (),|(),(  E F G
  • 14. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B,C,D,E,F,G  Initial factors: B A DC E F G H ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femegPafPdcePadPbcPbPaP fehPegPafPdcePadPbcPbPaP h  Step 2: Eliminate G t compute 1)|()(  g g egpem B A DC)()()|()|()|()|()()( fememafPdcePadPbcPbPaP h Eric Xing © Eric Xing @ CMU, 2006-2010 14 E F ),()|(),|()|()|()()( ),()()|(),|()|()|()()( femafPdcePadPbcPbPaP fememafPdcePadPbcPbPaP h hg  
  • 15. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B,C,D,E,F  Initial factors: B A DC E F G H),()|(),|()|()|()()( ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femafPdcePadPbcPbPaP femegPafPdcePadPbcPbPaP fehPegPafPdcePadPbcPbPaP h h    Step 3: Eliminate F t ),()|(),|()|()|()()( ff h  compute  f hf femafpaem ),()|(),( )()|()|()|()()( eamdcePadPbcPbPaP B A DC Eric Xing © Eric Xing @ CMU, 2006-2010 15 ),(),|()|()|()()( eamdcePadPbcPbPaP f E
  • 16. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B,C,D,E  Initial factors: B A DC E F G H),()|(),|()|()|()()( ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femafPdcePadPbcPbPaP femegPafPdcePadPbcPbPaP fehPegPafPdcePadPbcPbPaP h h    Step 4: Eliminate E t ),(),|()|()|()()( ),()|(),|()|()|()()( eamdcePadPbcPbPaP ff f h  B A DC  compute  e fe eamdcepdcam ),(),|(),,( )()|()|()()( dcamadPbcPbPaP B A DC Eric Xing © Eric Xing @ CMU, 2006-2010 16 E ),,()|()|()()( dcamadPbcPbPaP e
  • 17. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B,C,D  Initial factors: B A DC E F G H),()|(),|()|()|()()( ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femafPdcePadPbcPbPaP femegPafPdcePadPbcPbPaP fehPegPafPdcePadPbcPbPaP h h   ),,()|()|()()( ),(),|()|()|()()( dcamadPbcPbPaP eamdcePadPbcPbPaP e f    Step 5: Eliminate D  compute  ed dcamadpcam ),,()|(),( B A C Eric Xing © Eric Xing @ CMU, 2006-2010 17 d ed p ),,()|(),( ),()|()()( camdcPbPaP d
  • 18. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B,C  Initial factors: B A DC E F G H),()|(),|()|()|()()( ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femafPdcePadPdcPbPaP femegPafPdcePadPdcPbPaP fehPegPafPdcePadPdcPbPaP h h   ),()|()()( ),,()|()|()()( ),(),|()|()|()()( camdcPbPaP dcamadPdcPbPaP eamdcePadPdcPbPaP d e f     Step 6: Eliminate C  compute  dc cambcpbam ),()|(),( B A Eric Xing © Eric Xing @ CMU, 2006-2010 18 ),()|()()( camdcPbPaP d c dc p ),()|(),(
  • 19. Example: Variable Elimination  Query: P(B |h) B A Example: Variable Elimination  Need to eliminate: B  Initial factors: B A DC E F G H),()|(),|()|()|()()( ),()|()|(),|()|()|()()( ),|()|()|(),|()|()|()()( femafPdcePadPdcPbPaP femegPafPdcePadPdcPbPaP fehPegPafPdcePadPdcPbPaP h h   ),()|()()( ),,()|()|()()( ),(),|()|()|()()( camdcPbPaP dcamadPdcPbPaP eamdcePadPdcPbPaP d e f     Step 7: Eliminate B  compute ),()()( bambPaP c  cb bambpam ),()()( A Eric Xing © Eric Xing @ CMU, 2006-2010 19 b cb p ),()()( )()( amaP b
  • 20. Example: Variable Elimination
 Query: P(A | \tilde{h})
 Step 8: Wrap-up
   p(a, \tilde{h}) = p(a) m_b(a),    p(\tilde{h}) = \sum_a p(a) m_b(a)
   P(a | \tilde{h}) = p(a) m_b(a) / \sum_a p(a) m_b(a)
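To make the eight steps above concrete, here is a minimal Python sketch of variable elimination on this same network. It is illustrative only: the factor representation (a tuple of variable names plus a numpy table), the helper names multiply / marginalize / eliminate, and the random CPT values are invented for this sketch, not taken from the lecture. All variables are assumed binary, and the evidence on H is absorbed by slicing p(h|e,f) at the observed value, exactly as in Step 1.

```python
import numpy as np
from functools import reduce

# A factor is a pair (vars, table): a tuple of variable names plus an
# ndarray with one (binary) axis per variable, in that order.

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    fv, ft = f
    gv, gt = g
    out = tuple(dict.fromkeys(fv + gv))          # union, order-preserving
    def expand(v, t):                            # align t's axes with `out`
        shape = [t.shape[v.index(x)] if x in v else 1 for x in out]
        perm = [v.index(x) for x in out if x in v]
        return t.transpose(perm).reshape(shape)
    return out, expand(fv, ft) * expand(gv, gt)

def marginalize(f, var):
    """Sum a variable out of a factor."""
    fv, ft = f
    return tuple(x for x in fv if x != var), ft.sum(axis=fv.index(var))

def eliminate(factors, order):
    """Eliminate the variables in `order`; return the remaining factor."""
    for z in order:
        involved = [f for f in factors if z in f[0]]
        factors = [f for f in factors if z not in f[0]]
        factors.append(marginalize(reduce(multiply, involved), z))
    return reduce(multiply, factors)

rng = np.random.default_rng(0)
def cpt(*shape):  # random table, normalized over the last (child) axis
    t = rng.random(shape)
    return t / t.sum(axis=-1, keepdims=True)

# P(a)P(b)P(c|b)P(d|a)P(e|c,d)P(f|a)P(g|e)P(h|e,f), all variables binary;
# evidence h = 1 is absorbed by slicing p(h|e,f), giving m_h(e,f)
factors = [(('a',), cpt(2)), (('b',), cpt(2)),
           (('b', 'c'), cpt(2, 2)), (('a', 'd'), cpt(2, 2)),
           (('c', 'd', 'e'), cpt(2, 2, 2)), (('a', 'f'), cpt(2, 2)),
           (('e', 'g'), cpt(2, 2)),
           (('e', 'f'), cpt(2, 2, 2)[..., 1])]   # m_h(e,f) = p(h=1 | e,f)

avars, table = eliminate(factors, order='gfedcb')  # eliminate G,F,E,D,C,B
print(avars, table / table.sum())                  # P(A | h=1)
```

Any elimination order gives the same answer; what changes is the size of the intermediate factors, which is the subject of the next slide.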
  • 21. Complexity of variable elimination
 Suppose in one elimination step we compute
   m_x(y_1, ..., y_k) = \sum_x m'_x(x, y_1, ..., y_k),   where
   m'_x(x, y_1, ..., y_k) = \prod_{i=1}^{k} m_i(x, y_{C_i})
 This requires
   k \cdot |Val(X)| \cdot \prod_i |Val(Y_{C_i})| multiplications
     (for each value of x, y_1, ..., y_k, we do k multiplications), and
   |Val(X)| \cdot \prod_i |Val(Y_{C_i})| additions
     (for each value of y_1, ..., y_k, we do |Val(X)| additions)
 Complexity is exponential in the number of variables in the intermediate factor
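As a hypothetical sizing exercise (the numbers are invented for illustration): suppose X and each of k = 3 neighbors Y_1, Y_2, Y_3 take 10 values, with factors m_i(x, y_i). The intermediate factor m'_x then has 10^4 entries, costing 3 \cdot 10 \cdot 10^3 = 3 \cdot 10^4 multiplications by the count above, and summing out X costs 10 \cdot 10^3 = 10^4 additions, leaving m_x(y_1, y_2, y_3) with 10^3 entries. Each additional variable in the intermediate factor multiplies these counts by another factor of 10; this is the exponential blow-up referred to above.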
  • 22. Elimination Cliques
 [Figure: the graph after each elimination step, from the full graph over
  A..H down to the single node A, with the elimination clique and message
  produced at each step: m_h(e,f), m_g(e), m_f(e,a), m_e(a,c,d), m_d(a,c),
  m_c(a,b), m_b(a)]
  • 23. Understanding Variable Elimination
 A graph elimination algorithm
 [Figure: moralization of the directed graph, then the sequence of graphs
  produced by eliminating H, G, F, E, D, C, B]
 Intermediate terms correspond to the cliques resulting from elimination
 "Good" elimination orderings lead to small cliques and hence reduce
   complexity (what will happen if we eliminate "e" first in the above graph?)
 Finding the optimum ordering is NP-hard, but for many graphs an optimum or
   near-optimum ordering can be found heuristically (see the sketch below)
 Applies to undirected GMs as well
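As a sketch of one such heuristic, the following greedy min-degree ordering repeatedly eliminates the node with the fewest remaining neighbors, adding fill-in edges among its neighbors. The function name and the dict encoding of the moralized graph of the running example are invented for this sketch; min-fill is another common heuristic.

```python
def min_degree_order(adj):
    """Greedy min-degree elimination ordering on an undirected graph.

    `adj` maps each node to the set of its neighbors (here, the moralized
    graph).  This is only a heuristic: finding an optimal ordering is NP-hard.
    """
    adj = {v: set(ns) for v, ns in adj.items()}      # work on a copy
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))      # fewest neighbors
        nbrs = adj.pop(v)
        for u in nbrs:
            adj[u] |= nbrs - {u}                     # add fill-in edges
            adj[u].discard(v)
        order.append(v)
    return order

# Moralized graph of the example (c-d and e-f are the moralizing edges)
graph = {'a': {'d', 'f'}, 'b': {'c'}, 'c': {'b', 'd', 'e'},
         'd': {'a', 'c', 'e'}, 'e': {'c', 'd', 'f', 'g', 'h'},
         'f': {'a', 'e', 'h'}, 'g': {'e'}, 'h': {'e', 'f'}}
print(min_degree_order(graph))   # a low-width ordering, e.g. b, g, ...
```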
  • 24. From Elimination to Belief Propagation
 Recall that the dependencies induced during marginalization are captured in
   elimination cliques
 Summation <-> elimination
 Intermediate term <-> elimination clique
 Can this lead to a generic inference algorithm?
  • 25. Tree GMs
 Undirected tree: a unique path between any pair of nodes
 Directed tree: all nodes except the root have exactly one parent
 Polytree: nodes can have multiple parents
  • 26. Equivalence of directed and undirected trees
 Any undirected tree can be converted to a directed tree by choosing a root
   node and directing all edges away from it
 A directed tree and the corresponding undirected tree make the same
   conditional independence assertions
 Parameterizations are essentially the same:
   Undirected tree: p(x) = (1/Z) \prod_{i \in V} \psi(x_i) \prod_{(i,j) \in E} \psi(x_i, x_j)
   Directed tree with root r: p(x) = p(x_r) \prod_{(i,j) \in E} p(x_j | x_i)
   Equivalence: take \psi(x_r) = p(x_r), \psi(x_i, x_j) = p(x_j | x_i), and all
     other node potentials equal to 1, so that Z = 1
 Evidence: ? (conditioning on an observed node just fixes its value in the
   potentials, and the tree structure is preserved)
  • 27. From elimination to message passing
 Recall the ELIMINATION algorithm:
   Choose an ordering Z in which the query node f is the final node
   Place all potentials on an active list
   Eliminate node i by removing all potentials containing i, taking the
     sum/product over x_i, and placing the resultant factor back on the list
 For a TREE graph:
   Choose the query node f as the root of the tree
   View the tree as a directed tree with edges pointing towards f
   The elimination ordering is given by a depth-first traversal
   Elimination of each node can then be seen as message-passing (or Belief
     Propagation) directly along the tree branches, rather than on some
     transformed graph
 Thus, we can use the tree itself as a data structure to do general inference!
  • 28. Message passing for trees
 Let m_{ji}(x_i) denote the factor resulting from eliminating all the
   variables below j, up to i; it is a function of x_i:
     m_{ji}(x_i) = \sum_{x_j} \psi(x_j) \psi(x_i, x_j) \prod_{k \in N(j) \setminus i} m_{kj}(x_j)
 This is reminiscent of a message sent from j to i
 m_{ji}(x_i) represents a "belief" about x_i coming from x_j!
  • 29. Elimination on trees is equivalent to message passing along tree branches!
 [Figure: a tree rooted at the query node f; eliminating the leaves k and l
  sends messages m_{kj} and m_{lj} to their parent j, which in turn sends
  m_{ji} to i]
  • 30. The message passing protocol
 A two-pass algorithm:
 [Figure: a tree with root X_1, child X_2, and grandchildren X_3, X_4; the
  upward pass sends m_{32}(x_2), m_{42}(x_2), m_{21}(x_1), and the downward
  pass sends m_{12}(x_2), m_{23}(x_3), m_{24}(x_4)]
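Here is a minimal sketch of the two-pass protocol in Python: a collect pass toward the root, then a distribute pass back out. The data layout (dicts of node and edge potentials) and the function names are conventions invented for this sketch; evidence can be absorbed by setting a node potential to an indicator vector, as in the usage below.

```python
import numpy as np

def sum_product_tree(adj, psi, psi_e, root):
    """Two-pass sum-product (belief propagation) on a tree.

    adj:   {node: list of neighbors} for an undirected tree
    psi:   {node: length-K node potential vector}
    psi_e: {(i, j): (K_i, K_j) edge potential}, stored once per edge
    Returns the unnormalized marginal belief at every node.
    """
    def edge(i, j):  # potential indexed [x_i, x_j], however it was stored
        return psi_e[(i, j)] if (i, j) in psi_e else psi_e[(j, i)].T

    msgs = {}
    def send(j, i):  # m_{j->i}(x_i) = sum_{x_j} psi(x_i,x_j) psi(x_j) prod_k m_{k->j}(x_j)
        incoming = [msgs[(k, j)] for k in adj[j] if k != i]
        prod = psi[j] * (np.prod(incoming, axis=0) if incoming else 1.0)
        msgs[(j, i)] = edge(i, j) @ prod

    def collect(i, parent):      # pass 1: leaves send first, toward the root
        for c in adj[i]:
            if c != parent:
                collect(c, i)
                send(c, i)

    def distribute(i, parent):   # pass 2: root sends back toward the leaves
        for c in adj[i]:
            if c != parent:
                send(i, c)
                distribute(c, i)

    collect(root, None)
    distribute(root, None)
    return {i: psi[i] * np.prod([msgs[(k, i)] for k in adj[i]], axis=0)
            for i in adj}

# Usage: a 3-node chain X1 - X2 - X3 with binary states and X3 observed as 0
adj = {1: [2], 2: [1, 3], 3: [2]}
psi = {1: np.ones(2), 2: np.ones(2), 3: np.array([1.0, 0.0])}  # evidence at X3
psi_e = {(1, 2): np.array([[0.9, 0.1], [0.1, 0.9]]),
         (2, 3): np.array([[0.9, 0.1], [0.1, 0.9]])}
beliefs = sum_product_tree(adj, psi, psi_e, root=1)
print({i: b / b.sum() for i, b in beliefs.items()})            # P(X_i | X3=0)
```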
  • 31. Belief Propagation (SP-algorithm): Sequential implementation
  • 32. Belief Propagation (SP-algorithm): Parallel synchronous implementation
 For a node of degree d, whenever messages have arrived on any subset of d-1
   edges, compute the message for the remaining edge and send!
 A pair of messages is eventually computed for each edge, one in each direction
 All incoming messages are eventually computed for each node
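To illustrate just the scheduling rule (not the message computation itself), the following hypothetical helper enumerates the rounds in which messages become sendable under the d-1 rule. On a tree every sweep makes progress, which is exactly the "non-blocking" property noted on the next slide; on a graph with a cycle it would stall, so the sketch raises an error in that case.

```python
def synchronous_rounds(adj):
    """Rounds in which directed messages j->i become sendable.

    A message j->i may be sent once j has received messages from all of its
    neighbors other than i (the d-1 rule).  On a tree this always terminates;
    a stalled sweep means the graph has a cycle.
    """
    pending = {(j, i) for j in adj for i in adj[j]}
    done, rounds = set(), []
    while pending:
        ready = {(j, i) for (j, i) in pending
                 if all((k, j) in done for k in adj[j] if k != i)}
        if not ready:
            raise ValueError("no sendable message: the graph has a cycle")
        rounds.append(sorted(ready))
        done |= ready
        pending -= ready
    return rounds

# On the tree of slide 30: the leaves fire first, then the interior replies
print(synchronous_rounds({1: [2], 2: [1, 3, 4], 3: [2], 4: [2]}))
# [[(1, 2), (3, 2), (4, 2)], [(2, 1), (2, 3), (2, 4)]]
```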
  • 33. Correctness of BP on trees
 Corollary: the synchronous implementation is "non-blocking"
 Thm: message passing guarantees obtaining all marginals in the tree
 What about non-trees?
  • 34. Inference on general GMs
 Now, what if the GM is not a tree-like graph?
 Can we still directly run the message-passing protocol along its edges?
 For non-trees, we do not have the guarantee that message-passing will be
   consistent!
 Then what?
 Construct a graph data structure from P that has a tree structure, and run
   message-passing on it!
 This is the junction tree algorithm
  • 35. Elimination Cliques
 Recall that the dependencies induced during marginalization are captured in
   elimination cliques
 Summation <-> elimination
 Intermediate term <-> elimination clique
 Can this lead to a generic inference algorithm?
  • 36. A Clique Tree
 [Figure: the elimination cliques of the running example, arranged as a tree
  with messages m_b, m_c, m_d, m_e, m_f, m_g, m_h passed along its edges]
 Example message:
   m_e(a,c,d) = \sum_e p(e | c,d) m_g(e) m_f(e,a)
  • 37. From Elimination to Message Passing
 Elimination <-> message passing on a clique tree
 [Figure: the graph-elimination sequence alongside the corresponding clique
  tree; e.g., m_e(a,c,d) = \sum_e p(e | c,d) m_g(e) m_f(e,a)]
 Messages can be reused
  • 38. From Elimination to Message Passing
 Elimination <-> message passing on a clique tree
 Another query ...
 [Figure: the same clique tree with the message directions re-oriented toward
  the new query node]
 Messages m_f and m_h are reused; the others need to be recomputed
  • 39. The Shafer-Shenoy Algorithm
 Message from clique i to clique j:
   \mu_{i \to j}(S_{ij}) = \sum_{C_i \setminus S_{ij}} \psi(C_i) \prod_{k \in N(i) \setminus j} \mu_{k \to i}(S_{ki})
 Clique marginal:
   p(C_i) \propto \psi(C_i) \prod_{k \in N(i)} \mu_{k \to i}(S_{ki})
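Below is a minimal sketch of these two equations, reusing the multiply and marginalize factor helpers from the variable-elimination sketch after slide 20. Clique potentials psi[i] are factors over the clique's variables, tree[i] lists the neighboring cliques, and the cache makes the message reuse of slides 37-38 explicit; the encoding is invented for this sketch, not a reference implementation.

```python
def ss_message(tree, psi, i, j, cache):
    """mu_{i->j}(S_ij): sum clique i's potential times its other incoming
    messages down to the separator S_ij = C_i intersect C_j."""
    if (i, j) not in cache:
        f = psi[i]
        for k in tree[i]:
            if k != j:
                f = multiply(f, ss_message(tree, psi, k, i, cache))
        sep = set(psi[i][0]) & set(psi[j][0])     # separator variables
        for v in set(f[0]) - sep:
            f = marginalize(f, v)
        cache[(i, j)] = f
    return cache[(i, j)]

def clique_marginal(tree, psi, i, cache):
    """p(C_i) is proportional to psi(C_i) times all incoming messages."""
    f = psi[i]
    for k in tree[i]:
        f = multiply(f, ss_message(tree, psi, k, i, cache))
    return f
```

Answering a second query with the same cache recomputes only the messages whose direction changes, which is exactly the reuse illustrated on slide 38.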
  • 40. A Sketch of the Junction Tree Algorithm
 The algorithm:
   Construction of junction trees --- a special clique tree
   Propagation of probabilities --- a message-passing protocol
 Results in marginal probabilities of all cliques --- solves all queries in a
   single run
 A generic exact inference algorithm for any GM
 Complexity: exponential in the size of the maximal clique --- a good
   elimination order often leads to a small maximal clique, and hence a good
   (i.e., thin) JT
 Many well-known algorithms are special cases of JT:
   forward-backward, Kalman filter, peeling, sum-product ...
  • 41. The junction tree algorithm for HMMs
 A junction tree for the HMM is a chain of cliques:
   (y_1, x_1), (y_1, y_2), (y_2, x_2), (y_2, y_3), ..., (y_{T-1}, y_T), (y_T, x_T)
 Rightward pass:
   \mu_{t \to t+1}(y_{t+1}) = \sum_{y_t} p(x_{t+1} | y_{t+1}) p(y_{t+1} | y_t) \mu_{t-1 \to t}(y_t)
                            = \sum_{y_t} p(x_{t+1} | y_{t+1}) a_{y_t, y_{t+1}} \mu_{t-1 \to t}(y_t)
   This is exactly the forward algorithm!
 Leftward pass:
   \mu_{t+1 \to t}(y_t) = \sum_{y_{t+1}} p(x_{t+1} | y_{t+1}) p(y_{t+1} | y_t) \mu_{t+2 \to t+1}(y_{t+1})
   This is exactly the backward algorithm!
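The two passes can be written directly as the familiar forward-backward recursions. This compact sketch assumes a tabular HMM (the variable names are our own) and leaves the messages unnormalized, so for long sequences one would rescale alpha and beta at each step to avoid underflow.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state marginals gamma[t] = P(y_t | x_1..x_T) for an HMM.

    pi:  initial state distribution over K states
    A:   (K, K) transition matrix, A[i, j] = p(y_{t+1}=j | y_t=i)
    B:   (K, M) emission matrix,  B[i, m] = p(x_t=m | y_t=i)
    obs: observed symbol indices x_1..x_T
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))              # rightward (forward) messages
    beta = np.ones((T, K))                # leftward (backward) messages

    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                 # rightward pass
        alpha[t] = B[:, obs[t]] * (alpha[t - 1] @ A)
    for t in range(T - 2, -1, -1):        # leftward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    gamma = alpha * beta                  # proportional to P(y_t, x_1..x_T)
    return gamma / gamma.sum(axis=1, keepdims=True)

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward_backward(pi, A, B, obs=[0, 1, 0]))
```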
  • 42. Summary
 The simple Eliminate algorithm captures the key algorithmic operation
   underlying probabilistic inference: taking a sum over a product of
   potential functions
 The computational complexity of the Eliminate algorithm can be reduced to
   purely graph-theoretic considerations
 This graph interpretation also provides hints about how to design improved
   inference algorithms
 What can we say about the overall computational complexity of the algorithm?
   In particular, how can we control the "size" of the summands that appear
   in the sequence of summation operations?