Approximative Bayesian Computation (ABC) Methods




                     Approximative Bayesian Computation
                              (ABC) Methods

                                         Christian P. Robert

                              Université Paris Dauphine and CREST-INSEE
                              http://www.ceremade.dauphine.fr/~xian


            Joint works with M. Beaumont, J.-M. Cornuet, A. Grelaud,
                     J.-M. Marin, F. Rodolphe, & J.-F. Tally
             Colloquium, Université de Montpellier 2, 12 February 2009
Approximative Bayesian Computation (ABC) Methods




Outline


      1   Introduction

      2   Population Monte Carlo

      3   ABC

      4   ABC-PMC

      5   ABC for model choice in GRFs
Approximative Bayesian Computation (ABC) Methods
  Introduction




General purpose



      Given a density π known up to a normalizing constant, π ∝ π̃, and an
      integrable function h, compute

            Π(h) = ∫ h(x)π(x)µ(dx) = ∫ h(x)π̃(x)µ(dx) / ∫ π̃(x)µ(dx)

      when ∫ h(x)π̃(x)µ(dx) is intractable.
Approximative Bayesian Computation (ABC) Methods
  Introduction
     Monte Carlo basics



Monte Carlo basics
      Generate an iid sample x_1, . . . , x_N from π and estimate Π(h) by

            Π̂_N^MC(h) = N⁻¹ Σ_{i=1}^N h(x_i).

      LLN: Π̂_N^MC(h) →_a.s. Π(h)

      If Π(h²) = ∫ h²(x)π(x)µ(dx) < ∞,

      CLT: √N ( Π̂_N^MC(h) − Π(h) ) →_L N( 0, Π{[h − Π(h)]²} ).


      Caveat
      Often impossible or inefficient to simulate directly from Π
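      A minimal Python sketch of this estimator (illustrative, with an
      assumed target π = N(0, 1) and h(x) = x², so Π(h) = 1):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_estimate(h, sampler, N=100_000):
    """Plain Monte Carlo: average h over an iid sample from pi."""
    x = sampler(N)
    return h(x).mean()

# pi = N(0,1) and h(x) = x^2, so Pi(h) = 1; error is O(N^{-1/2}) by the CLT.
print(mc_estimate(lambda x: x**2, lambda N: rng.normal(size=N)))
```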
Approximative Bayesian Computation (ABC) Methods
  Introduction
     Importance Sampling



Importance Sampling

      For a proposal distribution Q such that Q(dx) = q(x)µ(dx),
      alternative representation

            Π(h) = ∫ h(x){π/q}(x) q(x)µ(dx).

      Principle
      Generate an iid sample x_1, . . . , x_N ∼ Q and estimate Π(h) by

            Π̂_{Q,N}^IS(h) = N⁻¹ Σ_{i=1}^N h(x_i){π/q}(x_i).
Approximative Bayesian Computation (ABC) Methods
  Introduction
     Importance Sampling




      Then

      LLN: Π̂_{Q,N}^IS(h) →_a.s. Π(h)   and if Q((hπ/q)²) < ∞,

      CLT: √N ( Π̂_{Q,N}^IS(h) − Π(h) ) →_L N( 0, Q{(hπ/q − Π(h))²} ).

      Caveat
      If normalizing constant unknown, impossible to use Π̂_{Q,N}^IS

      Generic problem in Bayesian Statistics: π(θ|x) ∝ f (x|θ)π(θ).
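      A matching sketch of the IS estimator, in the favourable case where
      the normalizing constant of π is known (assumed setup: π = N(0, 1),
      proposal Q = N(0, 4), h(x) = x²):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_pi(x):   # target N(0,1), normalizing constant known
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_q(x):    # proposal Q = N(0, 2^2)
    return -0.5 * x**2 / 4 - 0.5 * np.log(2 * np.pi * 4)

N = 100_000
x = rng.normal(scale=2.0, size=N)      # x_i ~ Q
w = np.exp(log_pi(x) - log_q(x))       # importance ratios {pi/q}(x_i)
print(np.mean(w * x**2))               # unbiased IS estimate of Pi(h) = 1
```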
Approximative Bayesian Computation (ABC) Methods
  Introduction
     Importance Sampling



Self-Normalised Importance Sampling

      Self-normalised version

            Π̂_{Q,N}^SNIS(h) = ( Σ_{i=1}^N {π/q}(x_i) )⁻¹ Σ_{i=1}^N h(x_i){π/q}(x_i).

      LLN: Π̂_{Q,N}^SNIS(h) →_a.s. Π(h)

      and if Π((1 + h²)(π/q)) < ∞,

      CLT: √N ( Π̂_{Q,N}^SNIS(h) − Π(h) ) →_L N( 0, Π{(π/q)(h − Π(h))²} ).

      © The quality of the SNIS approximation depends on the
      choice of Q
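      The self-normalised version only needs π̃; a sketch under the same
      assumed setup (π̃ the unnormalised N(0, 1) density, Q = N(0, 4)):

```python
import numpy as np

rng = np.random.default_rng(2)

log_pi_tilde = lambda x: -0.5 * x**2                          # N(0,1), unnormalised
log_q = lambda x: -0.5 * x**2 / 4 - 0.5 * np.log(8 * np.pi)   # Q = N(0, 4)

N = 100_000
x = rng.normal(scale=2.0, size=N)
logw = log_pi_tilde(x) - log_q(x)
w = np.exp(logw - logw.max())          # stabilised in log scale
w /= w.sum()                           # self-normalisation
print(np.sum(w * x**2))                # SNIS estimate of Pi(h) = 1
```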
Approximative Bayesian Computation (ABC) Methods
  Introduction
     Importance Sampling



Iterated importance sampling

      Introduction of an algorithmic temporal dimension:

            x_i^(t) ∼ q_t(x|x_i^(t−1)) ,   i = 1, . . . , n,   t = 1, . . .

      and

            Î_t = (1/n) Σ_{i=1}^n ̺_i^(t) h(x_i^(t))

      is still unbiased [for Π_t(h)] when

            ̺_i^(t) = π_t(x_i^(t)) / q_t(x_i^(t)|x_i^(t−1)) ,   i = 1, . . . , n
Approximative Bayesian Computation (ABC) Methods
  Population Monte Carlo




      PMCA: Population Monte Carlo Algorithm
      At time t = 0,
            Generate (x_{i,0})_{1≤i≤N} iid ∼ Q_0
            Set ω_{i,0} = {π/q_0}(x_{i,0})
            Generate (J_{i,0})_{1≤i≤N} iid ∼ M(1, (ω̄_{i,0})_{1≤i≤N})
            Set x̃_{i,0} = x_{J_{i,0},0}
      At time t (t = 1, . . . , T ),
            Generate x_{i,t} ind ∼ Q_{i,t}(x̃_{i,t−1}, ·)
            Set ω_{i,t} = π(x_{i,t}) / q_{i,t}(x̃_{i,t−1}, x_{i,t})
            Generate (J_{i,t})_{1≤i≤N} iid ∼ M(1, (ω̄_{i,t})_{1≤i≤N})
            Set x̃_{i,t} = x_{J_{i,t},t}.

                              [Cappé, Douc, Guillin, Marin, & CPR, 2009]
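      A minimal PMC sketch, with one shared Gaussian random-walk kernel
      standing in for the Q_{i,t}'s and a target known up to a constant
      (both assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

log_pi = lambda x: -0.5 * x**2                   # unnormalised target N(0,1)
N, T, s = 5000, 10, 1.0

x = rng.uniform(-10, 10, size=N)                 # Q_0 = U(-10, 10)
logw = log_pi(x)                                 # log {pi/q_0}, up to a constant
for t in range(1, T + 1):
    w = np.exp(logw - logw.max()); w /= w.sum()  # normalised weights (omega-bar)
    x_res = x[rng.choice(N, size=N, p=w)]        # multinomial resampling step
    x = x_res + s * rng.normal(size=N)           # x_{i,t} ~ N(x-tilde, s^2)
    logw = log_pi(x) + 0.5 * (x - x_res) ** 2 / s**2  # log pi(x) - log q(x-tilde, x)
w = np.exp(logw - logw.max()); w /= w.sum()
print(np.sum(w * x**2))                          # approximates E_pi[x^2] = 1
```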
Approximative Bayesian Computation (ABC) Methods
  Population Monte Carlo




Notes on PMC

      After T iterations of PMCA, PMC estimator of Π(h) given by

            Π̄_{PMC}^{N,T}(h) = (1/T) Σ_{t=1}^T Σ_{i=1}^N ω̄_{i,t} h(x_{i,t}).

      1   ω̄_{i,t} means normalising over the whole sequence of simulations
      2   Q_{i,t}’s chosen arbitrarily under support constraint
      3   Q_{i,t}’s may depend on the whole sequence of simulations
      4   alternatives to multinomial sampling reduce variance/preserve
          “unbiasedness”
                    [Kitagawa, 1996 / Carpenter, Clifford & Fearnhead, 1997]
Approximative Bayesian Computation (ABC) Methods
  ABC




The ABC method

      Bayesian setting: target is π(θ)f (x|θ)
      When likelihood f (x|θ) not in closed form, likelihood-free rejection
      technique:
      ABC algorithm
      For an observation y ∼ f (y|θ), under the prior π(θ), keep jointly
      simulating
                           θ′ ∼ π(θ) , x ∼ f (x|θ′ ) ,
      until the auxiliary variable x is equal to the observed value, x = y.

                                                    [Pritchard et al., 1999]
Approximative Bayesian Computation (ABC) Methods
  ABC




Population genetics example
      Tree of ancestors in a sample of genes
Approximative Bayesian Computation (ABC) Methods
  ABC




A as approximative


      When y is a continuous random variable, equality x = y is replaced
      with a tolerance condition,

                                                   ̺(x, y) ≤ ǫ

      where ̺ is a distance between summary statistics
      Output distributed from

                           π(θ) Pθ {̺(x, y) < ǫ} ∝ π(θ|̺(x, y) < ǫ)
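      A sketch of the resulting rejection sampler on an assumed conjugate
      toy model (θ ∼ N(0, 1), x|θ ∼ N(θ, 1), y = 2), where the ǫ-posterior
      can be checked against the exact posterior N(1, 1/2):

```python
import numpy as np

rng = np.random.default_rng(4)

y_obs, eps, N = 2.0, 0.1, 200_000

theta = rng.normal(size=N)                  # theta' ~ pi(theta)
x = theta + rng.normal(size=N)              # x ~ f(x | theta')
accepted = theta[np.abs(x - y_obs) < eps]   # keep when rho(x, y) < eps

# sample from pi(theta | |x - y| < eps), close to N(1, 1/2) for small eps
print(accepted.mean(), accepted.var())
```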
Approximative Bayesian Computation (ABC) Methods
  ABC




ABC improvements


      Simulating from the prior is often poor in efficiency.
      Either modify the proposal distribution on θ to increase the density
      of x’s within the vicinity of y...
           [Marjoram et al, 2003; Bortot et al., 2007; Sisson et al., 2007]

      ...or view the problem as conditional density estimation and
      develop techniques that allow for a larger ǫ
                                                  [Beaumont et al., 2002]
Approximative Bayesian Computation (ABC) Methods
  ABC




ABC-MCMC


      Markov chain (θ^(t)) created via the transition function

                   ⎧ θ′ ∼ K(θ′|θ^(t))   if x ∼ f (x|θ′) is such that x = y
        θ^(t+1) =  ⎨                     and u ∼ U(0, 1) ≤ π(θ′)K(θ^(t)|θ′) / {π(θ^(t))K(θ′|θ^(t))} ,
                   ⎩ θ^(t)              otherwise,

      has the posterior π(θ|y) as stationary distribution
                                                    [Marjoram et al, 2003]
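      A sketch of this chain on the same assumed toy model (θ ∼ N(0, 1),
      x|θ ∼ N(θ, 1), y = 2), with "x = y" relaxed to the tolerance
      condition and a symmetric random-walk kernel K (so the K-ratio
      cancels):

```python
import numpy as np

rng = np.random.default_rng(5)

y_obs, eps, T = 2.0, 0.1, 50_000
log_prior = lambda t: -0.5 * t**2           # theta ~ N(0,1)

theta, chain = 0.0, []
for _ in range(T):
    prop = theta + 0.5 * rng.normal()       # theta' ~ K(.|theta), symmetric
    x = prop + rng.normal()                 # x ~ f(x|theta')
    if abs(x - y_obs) < eps and np.log(rng.uniform()) <= log_prior(prop) - log_prior(theta):
        theta = prop                        # accept; otherwise keep theta^(t)
    chain.append(theta)
print(np.mean(chain[T // 2:]))              # approx posterior mean, about 1
```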
Approximative Bayesian Computation (ABC) Methods
  ABC




ABC-PRC

      Another sequential version producing a sequence of Markov
      transition kernels K_t and of samples (θ_1^(t), . . . , θ_N^(t)) (1 ≤ t ≤ T )

      ABC-PRC Algorithm
      1   Pick θ⋆ at random among the previous θ_i^(t−1)’s,
          with probabilities ω_i^(t−1) (1 ≤ i ≤ N ).
      2   Generate
                θ_i^(t) ∼ K_t(θ|θ⋆) ,   x ∼ f (x|θ_i^(t)) ,
      3   Check that ̺(x, y) < ǫ, otherwise start again.

                                                     [Sisson et al., 2007]
Approximative Bayesian Computation (ABC) Methods
  ABC




ABC-PRC weight

      Probability ω_i^(t) computed as

            ω_i^(t) ∝ π(θ_i^(t)) L_{t−1}(θ⋆|θ_i^(t)) {π(θ⋆)K_t(θ_i^(t)|θ⋆)}⁻¹ ,

      where L_{t−1} is an arbitrary transition kernel.
      In case
            L_{t−1}(θ′|θ) = K_t(θ|θ′) ,
      all weights are equal under a uniform prior.
      Inspired from Del Moral et al. (2006), who use backward kernels
      L_{t−1} in SMC to achieve unbiasedness
Approximative Bayesian Computation (ABC) Methods
  ABC




ABC-PRC bias
                                   Lack of unbiasedness of the method

      Joint density of the accepted pair (θ^(t−1), θ^(t)) proportional to

            π(θ^(t−1)|y) K_t(θ^(t)|θ^(t−1)) f (y|θ^(t)) .

      For an arbitrary function h(θ), E[ω_t h(θ^(t))] proportional to

            ∬ h(θ^(t)) [ π(θ^(t)) L_{t−1}(θ^(t−1)|θ^(t)) / {π(θ^(t−1)) K_t(θ^(t)|θ^(t−1))} ]
                  × π(θ^(t−1)|y) K_t(θ^(t)|θ^(t−1)) f (y|θ^(t)) dθ^(t−1) dθ^(t)

            ∝ ∬ h(θ^(t)) [ π(θ^(t)) L_{t−1}(θ^(t−1)|θ^(t)) / {π(θ^(t−1)) K_t(θ^(t)|θ^(t−1))} ]
                  × π(θ^(t−1)) f (y|θ^(t−1)) K_t(θ^(t)|θ^(t−1)) f (y|θ^(t)) dθ^(t−1) dθ^(t)

            ∝ ∫ h(θ^(t)) π(θ^(t)|y) { ∫ L_{t−1}(θ^(t−1)|θ^(t)) f (y|θ^(t−1)) dθ^(t−1) } dθ^(t) ,

      which differs from Π(h(θ)|y) unless the inner integral is constant in θ^(t).
Approximative Bayesian Computation (ABC) Methods
  ABC




A mixture example
      [Figure: ABC-PRC posterior approximations across iterations
      (densities over θ, two rows of five panels).]

                  Comparison of τ = 0.15 and τ = 1/0.15 in K_t
Approximative Bayesian Computation (ABC) Methods
  ABC-PMC




A PMC version
      Use of the same kernel idea as ABC-PRC but with IS correction
      Generate a sample at iteration t by

            π̂_t(θ^(t)) ∝ Σ_{j=1}^N ω_j^(t−1) K_t(θ^(t)|θ_j^(t−1))

      modulo acceptance of the associated x_t, and use an importance
      weight associated with an accepted simulation θ_i^(t)

            ω_i^(t) ∝ π(θ_i^(t)) / π̂_t(θ_i^(t)) .

      © Still likelihood free
                              [Beaumont et al., 2008, arXiv:0805.2256]
Approximative Bayesian Computation (ABC) Methods
  ABC-PMC




The ABC-PMC algorithm
      Given a decreasing sequence of approximation levels ǫ_1 ≥ . . . ≥ ǫ_T ,
      1. At iteration t = 1,
            For i = 1, ..., N
                  Simulate θ_i^(1) ∼ π(θ) and x ∼ f (x|θ_i^(1)) until ̺(x, y) < ǫ_1
                  Set ω_i^(1) = 1/N
            Take τ_2² as twice the empirical variance of the θ_i^(1)’s
      2. At iteration 2 ≤ t ≤ T ,
            For i = 1, ..., N , repeat
                  Pick θ_i^⋆ from the θ_j^(t−1)’s with probabilities ω_j^(t−1)
                  Generate θ_i^(t)|θ_i^⋆ ∼ N (θ_i^⋆, τ_t²) and x ∼ f (x|θ_i^(t))
            until ̺(x, y) < ǫ_t
            Set ω_i^(t) ∝ π(θ_i^(t)) / Σ_{j=1}^N ω_j^(t−1) ϕ( τ_t⁻¹ (θ_i^(t) − θ_j^(t−1)) )
            Take τ_{t+1}² as twice the weighted empirical variance of the θ_i^(t)’s
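      A sketch of the algorithm on the mixture toy model introduced on the
      next slide (N and the ǫ-sequence are illustrative choices):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# theta ~ U(-10,10), x|theta ~ 0.5 N(theta,1) + 0.5 N(theta,1/100), y = 0
def simulate(theta):
    sd = 1.0 if rng.uniform() < 0.5 else 0.1
    return theta + sd * rng.normal()

N, eps_seq = 1000, [2.0, 1.0, 0.5, 0.25, 0.1]

theta = np.empty(N)                                  # t = 1: plain rejection
for i in range(N):
    while True:
        cand = rng.uniform(-10, 10)
        if abs(simulate(cand)) < eps_seq[0]:
            theta[i] = cand
            break
w = np.full(N, 1.0 / N)

for eps in eps_seq[1:]:
    tau = np.sqrt(2 * np.cov(theta, aweights=w))     # tau_t^2 = twice weighted var
    old_theta, old_w = theta, w
    theta = np.empty(N)
    for i in range(N):
        while True:
            star = rng.choice(old_theta, p=old_w)    # pick theta* by weight
            cand = star + tau * rng.normal()         # kernel N(theta*, tau_t^2)
            if abs(cand) <= 10 and abs(simulate(cand)) < eps:
                theta[i] = cand
                break
    # flat prior on (-10,10): weight = 1 / mixture kernel density
    dens = (old_w * norm.pdf(theta[:, None], old_theta[None, :], tau)).sum(axis=1)
    w = 1.0 / dens
    w /= w.sum()

print(np.sum(w * theta), np.sum(w * theta**2))       # weighted posterior moments
```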
Approximative Bayesian Computation (ABC) Methods
  ABC-PMC




A mixture example (0)

      Toy model of Sisson et al. (2007): if

              θ ∼ U(−10, 10) ,              x|θ ∼ 0.5 N (θ, 1) + 0.5 N (θ, 1/100) ,

      then the posterior distribution associated with y = 0 is the normal
      mixture
                   θ|y = 0 ∼ 0.5 N (0, 1) + 0.5 N (0, 1/100)
      restricted to [−10, 10].
      Furthermore, true target available as

      π(θ||x| < ǫ) ∝ Φ(ǫ−θ)−Φ(−ǫ−θ)+Φ(10(ǫ−θ))−Φ(−10(ǫ+θ)) .
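      This closed form makes a numerical check easy; a sketch evaluating
      the ǫ-target on a grid (ǫ = 0.1 is an illustrative value):

```python
import numpy as np
from scipy.stats import norm

eps = 0.1
theta = np.linspace(-10, 10, 2001)
unnorm = (norm.cdf(eps - theta) - norm.cdf(-eps - theta)
          + norm.cdf(10 * (eps - theta)) - norm.cdf(-10 * (eps + theta)))
post = unnorm / (unnorm.sum() * (theta[1] - theta[0]))   # normalise on the grid
print(theta[np.argmax(post)])                            # mode sits at theta = 0
```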
Approximative Bayesian Computation (ABC) Methods
  ABC-PMC




A mixture example (2)
      Recovery of the target, whether using a fixed standard deviation of
      τ = 0.15 or τ = 1/0.15, or a sequence of adaptive τt ’s.
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs




ABC for model choice

      1   Introduction

      2   Population Monte Carlo

      3   ABC

      4   ABC-PMC

      5   ABC for model choice in GRFs
            Gibbs random fields
            Model choice via ABC
            Illustrations
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Gibbs random fields



Gibbs random fields


      Gibbs distribution
      y = (y_1 , . . . , y_n ) is a Gibbs random field associated with the
      graph G if

            f (y) = (1/Z) exp{ − Σ_{c∈C} V_c(y_c) } ,

      where Z is the normalising constant, C is the set of cliques and V_c
      is any function also called potential (and U (y) = Σ_{c∈C} V_c(y_c) is
      the energy function)

      © Z is usually unavailable in closed form
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Gibbs random fields



Potts model
      Potts model
      V_c(y) is of the form

            V_c(y) = θ S(y) = θ Σ_{l∼i} δ_{y_l = y_i}

      where l∼i denotes a neighbourhood structure

      In most realistic settings, the summation

            Z_θ = Σ_{x∈X} exp{θᵀ S(x)}

      involves too many terms to be manageable and numerical
      approximations cannot always be trusted
                            [Cucala, Marin, CPR & Titterington, 2009]
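      For concreteness, the statistic S(x) stays cheap to evaluate even
      though Z_θ is not; a sketch with the 4-neighbour relation on a
      lattice (an assumed choice of l ∼ i):

```python
import numpy as np

def potts_stat(x):
    """S(x): number of neighbouring pairs sharing the same colour."""
    return int((x[:, 1:] == x[:, :-1]).sum() + (x[1:, :] == x[:-1, :]).sum())

rng = np.random.default_rng(7)
x = rng.integers(0, 3, size=(10, 10))    # a 3-colour configuration
print(potts_stat(x))
# Z_theta would sum exp(theta * S(x)) over all 3**100 configurations:
# unmanageable, which is what motivates the ABC treatment here.
```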
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



Bayesian Model Choice

      Comparing a model with potential S0 taking values in Rp0 versus a
      model with potential S1 taking values in Rp1 through the Bayes
      factor corresponding to the priors π0 and π1 on each parameter
      space

            B_{m0/m1}(x) = ∫ exp{θ_0ᵀ S_0(x)}/Z_{θ_0,0} π_0(dθ_0)
                           / ∫ exp{θ_1ᵀ S_1(x)}/Z_{θ_1,1} π_1(dθ_1)

      Use of Jeffreys’ scale to select most appropriate model
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



Neighbourhood relations



      Choice to be made between M neighbourhood relations

            i ∼_m i′        (0 ≤ m ≤ M − 1)

      with
            S_m(x) = Σ_{i ∼_m i′} I{x_i = x_{i′}}

      driven by the posterior probabilities of the models.
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



Model index



      Formalisation via a model index M that appears as a new
      parameter with prior distribution π(M = m) and
      π(θ|M = m) = πm (θm )
      Computational target:

            P(M = m|x) ∝ ∫_{Θ_m} f_m(x|θ_m) π_m(θ_m) dθ_m  π(M = m) ,
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



Sufficient statistics
      By definition, if S(x) is a sufficient statistic for the joint parameters
      (M, θ_0 , . . . , θ_{M−1} ),

            P(M = m|x) = P(M = m|S(x)) .

      For each model m, own sufficient statistic S_m(·), and
      S(·) = (S_0(·), . . . , S_{M−1}(·)) also sufficient.
      For Gibbs random fields,

            x|M = m ∼ f_m(x|θ_m) = f_m¹(x|S(x)) f_m²(S(x)|θ_m)
                                 = (1/n(S(x))) f_m²(S(x)|θ_m)

      where
            n(S(x)) = ♯{x̃ ∈ X : S(x̃) = S(x)}

      © S(x) is therefore also sufficient for the joint parameters
                                  [Specific to Gibbs random fields!]
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



ABC model choice Algorithm


      ABC-MC
            Generate m∗ from the prior π(M = m).
            Generate θ∗_{m∗} from the prior π_{m∗}(·).
            Generate x∗ from the model f_{m∗}(·|θ∗_{m∗}).
            Compute the distance ρ(S(x0), S(x∗)).
            Accept (θ∗_{m∗}, m∗) if ρ(S(x0), S(x∗)) < ǫ.

      Note When ǫ = 0 the algorithm is exact
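      A sketch of ABC-MC on the Bernoulli-versus-Markov toy example of the
      following slides (n, the number of proposals and ǫ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)

n, N, eps = 100, 100_000, 2.0

def simulate(m, theta):
    p = np.exp(theta) / (1 + np.exp(theta))
    if m == 0:                            # f0: iid Bernoulli(p)
        return (rng.uniform(size=n) < p).astype(int)
    x = np.empty(n, dtype=int)            # f1: two-state Markov chain
    x[0] = rng.integers(0, 2)             # with P(x_i = x_{i-1}) = p
    for i in range(1, n):
        x[i] = x[i - 1] if rng.uniform() < p else 1 - x[i - 1]
    return x

S = lambda x: np.array([(x == 1).sum(), (x[1:] == x[:-1]).sum()])
x0 = simulate(1, 2.0)                     # pseudo-observed data from model 1
S_obs = S(x0)

kept = []
for _ in range(N):
    m = int(rng.integers(0, 2))           # m* ~ pi(M = m) = 1/2
    theta = rng.uniform(-5, 5) if m == 0 else rng.uniform(0, 6)
    if np.linalg.norm(S(simulate(m, theta)) - S_obs) < eps:
        kept.append(m)                    # accept (theta*, m*)
kept = np.array(kept)
print((kept == 0).mean(), (kept == 1).mean())   # P-hat(M = m | x0)
```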
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Model choice via ABC



ABC approximation to the Bayes factor

      Frequency ratio:

            B̂F_{m0/m1}(x0) = P̂(M = m0|x0)/P̂(M = m1|x0) × π(M = m1)/π(M = m0)
                            = ♯{m_i∗ = m0}/♯{m_i∗ = m1} × π(M = m1)/π(M = m0) ,

      replaced with

            B̂F_{m0/m1}(x0) = (1 + ♯{m_i∗ = m0})/(1 + ♯{m_i∗ = m1}) × π(M = m1)/π(M = m0)

      to avoid indeterminacy (also Bayes estimate).
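      As a one-liner on the accepted indices of the sketch above (`bf_hat`
      is a hypothetical helper name; with a uniform model prior the prior
      ratio is 1):

```python
import numpy as np

def bf_hat(kept, m0=0, m1=1, prior_ratio=1.0):
    """(1 + #{m* = m0}) / (1 + #{m* = m1}) x pi(M = m1)/pi(M = m0)."""
    kept = np.asarray(kept)
    return (1 + (kept == m0).sum()) / (1 + (kept == m1).sum()) * prior_ratio
```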
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Illustrations



Toy example

      iid Bernoulli model versus two-state first-order Markov chain, i.e.

            f_0(x|θ_0) = exp( θ_0 Σ_{i=1}^n I{x_i = 1} ) / {1 + exp(θ_0)}ⁿ ,

      versus

            f_1(x|θ_1) = ½ exp( θ_1 Σ_{i=2}^n I{x_i = x_{i−1}} ) / {1 + exp(θ_1)}ⁿ⁻¹ ,

      with priors θ_0 ∼ U(−5, 5) and θ_1 ∼ U(0, 6) (inspired by “phase
      transition” boundaries).
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Illustrations



Toy example (2)




                                                             10
                            5




                                                             5
                     BF01




                                                      BF01

                                                             0
                            0
                      ^




                                                       ^

                                                             −10 −5
                            −5




                                 −40   −20     0 10                   −40   −20     0 10

                                        BF01                                 BF01


      (left) Comparison of the true BF m0 /m1 (x0 ) with BF m0 /m1 (x0 )
      (in logs) over 2, 000 simulations and 4.106 proposals from the
      prior. (right) Same when using tolerance ǫ corresponding to the
      1% quantile on the distances.
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Illustrations



Protein folding




      Superposition of the native structure (grey) with the ST1
      structure (red), the ST2 structure (orange), the ST3 structure
      (green), and the DT structure (blue).
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Illustrations



Protein folding (2)


                                % seq. Id.   TM-score   FROST score
            1i5nA (ST1)             32         0.86        75.3
            1ls1A1 (ST2)             5         0.42         8.9
            1jr8A (ST3)              4         0.24         8.9
            1s7oA (DT)              10         0.08         7.8

      Characteristics of the dataset. % seq. Id.: percentage of identity
      with the query sequence. TM-score: similarity between predicted and
      native structure (uncertainty between 0.17 and 0.4). FROST score:
      quality of alignment of the query onto the candidate structure
      (uncertainty between 7 and 9).
Approximative Bayesian Computation (ABC) Methods
  ABC for model choice in GRFs
     Illustrations



Protein folding (3)



                            NS/ST1   NS/ST2   NS/ST3   NS/DT
            BF               1.34     1.22     2.42     2.76
            P(M = NS|x0)     0.573    0.551    0.708    0.734

      Estimates of the Bayes factors between model NS and models
      ST1, ST2, ST3, and DT, and corresponding posterior
      probabilities of model NS based on an ABC-MC algorithm using
      1.2·10⁶ simulations and a tolerance ǫ equal to the 1% quantile of
      the distances.

Contenu connexe

Tendances

Learning Sparse Representation
Learning Sparse RepresentationLearning Sparse Representation
Learning Sparse RepresentationGabriel Peyré
 
Geodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGeodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGabriel Peyré
 
Divergence center-based clustering and their applications
Divergence center-based clustering and their applicationsDivergence center-based clustering and their applications
Divergence center-based clustering and their applicationsFrank Nielsen
 
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...Gabriel Peyré
 
Divergence clustering
Divergence clusteringDivergence clustering
Divergence clusteringFrank Nielsen
 
Classification with mixtures of curved Mahalanobis metrics
Classification with mixtures of curved Mahalanobis metricsClassification with mixtures of curved Mahalanobis metrics
Classification with mixtures of curved Mahalanobis metricsFrank Nielsen
 
Computational Information Geometry: A quick review (ICMS)
Computational Information Geometry: A quick review (ICMS)Computational Information Geometry: A quick review (ICMS)
Computational Information Geometry: A quick review (ICMS)Frank Nielsen
 
Comparing estimation algorithms for block clustering models
Comparing estimation algorithms for block clustering modelsComparing estimation algorithms for block clustering models
Comparing estimation algorithms for block clustering modelsBigMC
 
Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems
Low Complexity Regularization of Inverse Problems - Course #1 Inverse ProblemsLow Complexity Regularization of Inverse Problems - Course #1 Inverse Problems
Low Complexity Regularization of Inverse Problems - Course #1 Inverse ProblemsGabriel Peyré
 
Mesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionMesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionGabriel Peyré
 
Jyokyo-kai-20120605
Jyokyo-kai-20120605Jyokyo-kai-20120605
Jyokyo-kai-20120605ketanaka
 
Reflect tsukuba524
Reflect tsukuba524Reflect tsukuba524
Reflect tsukuba524kazuhase2011
 
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...Gabriel Peyré
 
Adaptive Signal and Image Processing
Adaptive Signal and Image ProcessingAdaptive Signal and Image Processing
Adaptive Signal and Image ProcessingGabriel Peyré
 
Engr 371 final exam april 2010
Engr 371 final exam april 2010Engr 371 final exam april 2010
Engr 371 final exam april 2010amnesiann
 
Engr 371 final exam april 2006
Engr 371 final exam april 2006Engr 371 final exam april 2006
Engr 371 final exam april 2006amnesiann
 
Model Selection with Piecewise Regular Gauges
Model Selection with Piecewise Regular GaugesModel Selection with Piecewise Regular Gauges
Model Selection with Piecewise Regular GaugesGabriel Peyré
 

Tendances (20)

Learning Sparse Representation
Learning Sparse RepresentationLearning Sparse Representation
Learning Sparse Representation
 
Geodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and GraphicsGeodesic Method in Computer Vision and Graphics
Geodesic Method in Computer Vision and Graphics
 
Divergence center-based clustering and their applications
Divergence center-based clustering and their applicationsDivergence center-based clustering and their applications
Divergence center-based clustering and their applications
 
YSC 2013
YSC 2013YSC 2013
YSC 2013
 
Athens workshop on MCMC
Athens workshop on MCMCAthens workshop on MCMC
Athens workshop on MCMC
 
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...
Low Complexity Regularization of Inverse Problems - Course #3 Proximal Splitt...
 
Divergence clustering
Divergence clusteringDivergence clustering
Divergence clustering
 
Classification with mixtures of curved Mahalanobis metrics
Classification with mixtures of curved Mahalanobis metricsClassification with mixtures of curved Mahalanobis metrics
Classification with mixtures of curved Mahalanobis metrics
 
Computational Information Geometry: A quick review (ICMS)
Computational Information Geometry: A quick review (ICMS)Computational Information Geometry: A quick review (ICMS)
Computational Information Geometry: A quick review (ICMS)
 
Comparing estimation algorithms for block clustering models
Comparing estimation algorithms for block clustering modelsComparing estimation algorithms for block clustering models
Comparing estimation algorithms for block clustering models
 
Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems
Low Complexity Regularization of Inverse Problems - Course #1 Inverse ProblemsLow Complexity Regularization of Inverse Problems - Course #1 Inverse Problems
Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems
 
Mesh Processing Course : Multiresolution
Mesh Processing Course : MultiresolutionMesh Processing Course : Multiresolution
Mesh Processing Course : Multiresolution
 
Jyokyo-kai-20120605
Jyokyo-kai-20120605Jyokyo-kai-20120605
Jyokyo-kai-20120605
 
Reflect tsukuba524
Reflect tsukuba524Reflect tsukuba524
Reflect tsukuba524
 
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
Low Complexity Regularization of Inverse Problems - Course #2 Recovery Guaran...
 
Adaptive Signal and Image Processing
Adaptive Signal and Image ProcessingAdaptive Signal and Image Processing
Adaptive Signal and Image Processing
 
Engr 371 final exam april 2010
Engr 371 final exam april 2010Engr 371 final exam april 2010
Engr 371 final exam april 2010
 
Engr 371 final exam april 2006
Engr 371 final exam april 2006Engr 371 final exam april 2006
Engr 371 final exam april 2006
 
ABC in Roma
ABC in RomaABC in Roma
ABC in Roma
 
Model Selection with Piecewise Regular Gauges
Model Selection with Piecewise Regular GaugesModel Selection with Piecewise Regular Gauges
Model Selection with Piecewise Regular Gauges
 

Similaire à Montpellier Math Colloquium

Xian's abc
Xian's abcXian's abc
Xian's abcDeb Roy
 
Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration Jagadeeswaran Rathinavel
 
Triangle counting handout
Triangle counting handoutTriangle counting handout
Triangle counting handoutcsedays
 
Omiros' talk on the Bernoulli factory problem
Omiros' talk on the  Bernoulli factory problemOmiros' talk on the  Bernoulli factory problem
Omiros' talk on the Bernoulli factory problemBigMC
 
Regression Theory
Regression TheoryRegression Theory
Regression TheorySSA KPI
 
Talk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesTalk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesPierre Jacob
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methodsChristian Robert
 
On Clustering Histograms with k-Means by Using Mixed α-Divergences
 On Clustering Histograms with k-Means by Using Mixed α-Divergences On Clustering Histograms with k-Means by Using Mixed α-Divergences
On Clustering Histograms with k-Means by Using Mixed α-DivergencesFrank Nielsen
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?Christian Robert
 
Unbiased Hamiltonian Monte Carlo
Unbiased Hamiltonian Monte CarloUnbiased Hamiltonian Monte Carlo
Unbiased Hamiltonian Monte CarloJeremyHeng10
 
Let's Practice What We Preach: Likelihood Methods for Monte Carlo Data
Let's Practice What We Preach: Likelihood Methods for Monte Carlo DataLet's Practice What We Preach: Likelihood Methods for Monte Carlo Data
Let's Practice What We Preach: Likelihood Methods for Monte Carlo DataChristian Robert
 
Non-sampling functional approximation of linear and non-linear Bayesian Update
Non-sampling functional approximation of linear and non-linear Bayesian UpdateNon-sampling functional approximation of linear and non-linear Bayesian Update
Non-sampling functional approximation of linear and non-linear Bayesian UpdateAlexander Litvinenko
 
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...How to find a cheap surrogate to approximate Bayesian Update Formula and to a...
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...Alexander Litvinenko
 
TR tabling presentation_2010_09
TR tabling presentation_2010_09TR tabling presentation_2010_09
TR tabling presentation_2010_09Paul Fodor
 
ABC with data cloning for MLE in state space models
ABC with data cloning for MLE in state space modelsABC with data cloning for MLE in state space models
ABC with data cloning for MLE in state space modelsUmberto Picchini
 

Similaire à Montpellier Math Colloquium (20)

Xian's abc
Xian's abcXian's abc
Xian's abc
 
Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration
 
Triangle counting handout
Triangle counting handoutTriangle counting handout
Triangle counting handout
 
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
2018 MUMS Fall Course - Statistical Representation of Model Input (EDITED) - ...
 
Omiros' talk on the Bernoulli factory problem
Omiros' talk on the  Bernoulli factory problemOmiros' talk on the  Bernoulli factory problem
Omiros' talk on the Bernoulli factory problem
 
Regression Theory
Regression TheoryRegression Theory
Regression Theory
 
Talk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniquesTalk at CIRM on Poisson equation and debiasing techniques
Talk at CIRM on Poisson equation and debiasing techniques
 
Shanghai tutorial
Shanghai tutorialShanghai tutorial
Shanghai tutorial
 
Convergence of ABC methods
Convergence of ABC methodsConvergence of ABC methods
Convergence of ABC methods
 
On Clustering Histograms with k-Means by Using Mixed α-Divergences
 On Clustering Histograms with k-Means by Using Mixed α-Divergences On Clustering Histograms with k-Means by Using Mixed α-Divergences
On Clustering Histograms with k-Means by Using Mixed α-Divergences
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
 
Unbiased Hamiltonian Monte Carlo
Unbiased Hamiltonian Monte CarloUnbiased Hamiltonian Monte Carlo
Unbiased Hamiltonian Monte Carlo
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
cswiercz-general-presentation
cswiercz-general-presentationcswiercz-general-presentation
cswiercz-general-presentation
 
Let's Practice What We Preach: Likelihood Methods for Monte Carlo Data
Let's Practice What We Preach: Likelihood Methods for Monte Carlo DataLet's Practice What We Preach: Likelihood Methods for Monte Carlo Data
Let's Practice What We Preach: Likelihood Methods for Monte Carlo Data
 
Non-sampling functional approximation of linear and non-linear Bayesian Update
Non-sampling functional approximation of linear and non-linear Bayesian UpdateNon-sampling functional approximation of linear and non-linear Bayesian Update
Non-sampling functional approximation of linear and non-linear Bayesian Update
 
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...How to find a cheap surrogate to approximate Bayesian Update Formula and to a...
How to find a cheap surrogate to approximate Bayesian Update Formula and to a...
 
TR tabling presentation_2010_09
TR tabling presentation_2010_09TR tabling presentation_2010_09
TR tabling presentation_2010_09
 
ABC with data cloning for MLE in state space models
ABC with data cloning for MLE in state space modelsABC with data cloning for MLE in state space models
ABC with data cloning for MLE in state space models
 
16 fft
16 fft16 fft
16 fft
 

Plus de Christian Robert

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceChristian Robert
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinChristian Robert
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?Christian Robert
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Christian Robert
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Christian Robert
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking componentsChristian Robert
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihoodChristian Robert
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)Christian Robert
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerChristian Robert
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Christian Robert
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussionChristian Robert
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceChristian Robert
 

Plus de Christian Robert (20)

Asymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de FranceAsymptotics of ABC, lecture, Collège de France
Asymptotics of ABC, lecture, Collège de France
 
Workshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael MartinWorkshop in honour of Don Poskitt and Gael Martin
Workshop in honour of Don Poskitt and Gael Martin
 
discussion of ICML23.pdf
discussion of ICML23.pdfdiscussion of ICML23.pdf
discussion of ICML23.pdf
 
How many components in a mixture?
How many components in a mixture?How many components in a mixture?
How many components in a mixture?
 
restore.pdf
restore.pdfrestore.pdf
restore.pdf
 
Testing for mixtures at BNP 13
Testing for mixtures at BNP 13Testing for mixtures at BNP 13
Testing for mixtures at BNP 13
 
Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?Inferring the number of components: dream or reality?
Inferring the number of components: dream or reality?
 
CDT 22 slides.pdf
CDT 22 slides.pdfCDT 22 slides.pdf
CDT 22 slides.pdf
 
Testing for mixtures by seeking components
Testing for mixtures by seeking componentsTesting for mixtures by seeking components
Testing for mixtures by seeking components
 
discussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihooddiscussion on Bayesian restricted likelihood
discussion on Bayesian restricted likelihood
 
NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)NCE, GANs & VAEs (and maybe BAC)
NCE, GANs & VAEs (and maybe BAC)
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like sampler
 
eugenics and statistics
eugenics and statisticseugenics and statistics
eugenics and statistics
 
Laplace's Demon: seminar #1
Laplace's Demon: seminar #1Laplace's Demon: seminar #1
Laplace's Demon: seminar #1
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
asymptotics of ABC
asymptotics of ABCasymptotics of ABC
asymptotics of ABC
 
Likelihood-free Design: a discussion
Likelihood-free Design: a discussionLikelihood-free Design: a discussion
Likelihood-free Design: a discussion
 
the ABC of ABC
the ABC of ABCthe ABC of ABC
the ABC of ABC
 
CISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergenceCISEA 2019: ABC consistency and convergence
CISEA 2019: ABC consistency and convergence
 

Dernier

fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room servicediscovermytutordmt
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfAyushMahapatra5
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAssociation for Project Management
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104misteraugie
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Disha Kariya
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 

Dernier (20)

fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdf
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room service
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
APM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across SectorsAPM Welcome, APM North West Network Conference, Synergies Across Sectors
APM Welcome, APM North West Network Conference, Synergies Across Sectors
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 

Montpellier Math Colloquium

  • 7. Self-Normalised Importance Sampling

Self-normalised version:

$$\hat\Pi_{Q,N}^{SNIS}(h) = \left( \sum_{i=1}^{N} \{\pi/q\}(x_i) \right)^{-1} \sum_{i=1}^{N} h(x_i)\,\{\pi/q\}(x_i)\,.$$

LLN: $\hat\Pi_{Q,N}^{SNIS}(h) \xrightarrow{\text{a.s.}} \Pi(h)$, and if $\Pi\big((1+h^2)(\pi/q)\big) < \infty$,

CLT: $\sqrt{N}\big(\hat\Pi_{Q,N}^{SNIS}(h) - \Pi(h)\big) \rightsquigarrow \mathcal{N}\big(0,\ \Pi\{(\pi/q)(h - \Pi(h))^2\}\big)\,.$

The quality of the SNIS approximation depends on the choice of Q.
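To make the estimator concrete, here is a minimal Python sketch of SNIS, assuming an unnormalised target known through log_pi_tilde and taking a standard normal proposal Q; all names (snis, log_pi_tilde, h) are illustrative, not from the references:

import numpy as np
from scipy import stats

def snis(h, log_pi_tilde, N=10_000, rng=np.random.default_rng(0)):
    # proposal Q = N(0, 1): an arbitrary illustrative choice
    x = rng.standard_normal(N)
    # log{pi~/q}(x_i); the unknown normalising constant of pi~ cancels below
    log_w = log_pi_tilde(x) - stats.norm.logpdf(x)
    w = np.exp(log_w - log_w.max())
    return np.sum(w * h(x)) / np.sum(w)   # self-normalised estimate of Pi(h)

# example: pi~(x) = exp(-(x - 2)^2 / 2), i.e. N(2, 1) up to a constant;
# snis(...) then approximates the posterior mean, 2
est = snis(lambda x: x, lambda x: -0.5 * (x - 2.0) ** 2)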
  • 8. Iterated Importance Sampling

Introduction of an algorithmic temporal dimension:

$$x_i^{(t)} \sim q_t(x \mid x_i^{(t-1)})\,, \quad i = 1, \ldots, n, \quad t = 1, \ldots$$

and

$$\hat I_t = \frac{1}{n} \sum_{i=1}^{n} \varrho_i^{(t)} h(x_i^{(t)})$$

remains unbiased when the weights are

$$\varrho_i^{(t)} = \frac{\pi_t(x_i^{(t)})}{q_t(x_i^{(t)} \mid x_i^{(t-1)})}\,, \quad i = 1, \ldots, n$$
  • 9. PMCA: Population Monte Carlo Algorithm

At time t = 0:
  Generate $(x_{i,0})_{1 \le i \le N} \overset{iid}{\sim} Q_0$
  Set $\omega_{i,0} = \{\pi/q_0\}(x_{i,0})$
  Generate $(J_{i,0})_{1 \le i \le N} \overset{iid}{\sim} \mathcal{M}(1, (\bar\omega_{i,0})_{1 \le i \le N})$
  Set $\tilde x_{i,0} = x_{J_{i,0},0}$

At time t (t = 1, ..., T):
  Generate $x_{i,t} \overset{ind}{\sim} Q_{i,t}(\tilde x_{i,t-1}, \cdot)$
  Set $\omega_{i,t} = \pi(x_{i,t})\big/q_{i,t}(\tilde x_{i,t-1}, x_{i,t})$
  Generate $(J_{i,t})_{1 \le i \le N} \overset{iid}{\sim} \mathcal{M}(1, (\bar\omega_{i,t})_{1 \le i \le N})$
  Set $\tilde x_{i,t} = x_{J_{i,t},t}$

[Cappé, Douc, Guillin, Marin, & CPR, 2009]
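A schematic Python rendering of PMCA with a common Gaussian random-walk kernel for the Q_{i,t}'s; the target log_pi_tilde, the fixed kernel scale, and the use of one shared kernel are all simplifying assumptions of this sketch:

import numpy as np

def pmc(log_pi_tilde, N=5_000, T=10, scale=1.0, rng=np.random.default_rng(1)):
    x = rng.standard_normal(N)                 # Q_0 = N(0, 1)
    log_w = log_pi_tilde(x) + 0.5 * x**2       # log{pi/q_0}, up to constants
    for t in range(T):
        w = np.exp(log_w - log_w.max())
        w /= w.sum()                           # normalised weights omega-bar
        x_tilde = x[rng.choice(N, size=N, p=w)]       # multinomial resampling
        x = x_tilde + scale * rng.standard_normal(N)  # Q_t(x~, .) = N(x~, scale^2)
        log_w = log_pi_tilde(x) + 0.5 * ((x - x_tilde) / scale) ** 2
    w = np.exp(log_w - log_w.max())
    return x, w / w.sum()                      # weighted sample targeting pi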
  • 10. Notes on PMC

After T iterations of PMCA, PMC estimator of $\Pi(h)$ given by

$$\bar\Pi_{N,T}^{PMC}(h) = \frac{1}{T} \sum_{t=1}^{T} \sum_{i=1}^{N} \bar\omega_{i,t}\, h(x_{i,t})\,.$$

1. $\bar\omega_{i,t}$ means normalising over whole sequence of simulations
2. $Q_{i,t}$'s chosen arbitrarily under support constraint
3. $Q_{i,t}$'s may depend on whole sequence of simulations
4. alternatives to multinomial sampling reduce variance / preserve "unbiasedness"

[Kitagawa, 1996 / Carpenter, Clifford & Fearnhead, 1997]
  • 11. The ABC Method

Bayesian setting: target is $\pi(\theta) f(x|\theta)$

When likelihood $f(x|\theta)$ not in closed form, likelihood-free rejection technique:

ABC algorithm: For an observation $y \sim f(y|\theta)$, under the prior $\pi(\theta)$, keep jointly simulating

$$\theta' \sim \pi(\theta)\,, \quad x \sim f(x|\theta')\,,$$

until the auxiliary variable x is equal to the observed value, x = y.

[Pritchard et al., 1999]
  • 12. Population Genetics Example

[Figure: tree of ancestors in a sample of genes.]
  • 13. A as approximative

When y is a continuous random variable, the equality x = y is replaced with a tolerance condition

$$\varrho(x, y) \le \epsilon$$

where $\varrho$ is a distance between summary statistics.

Output distributed from

$$\pi(\theta)\, P_\theta\{\varrho(x, y) < \epsilon\} \propto \pi(\theta \mid \varrho(x, y) < \epsilon)$$
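The tolerance version of the algorithm is short enough to state as code; the following Python sketch assumes user-supplied prior_sample, simulate, and rho functions (illustrative names, not from the references):

import numpy as np
rng = np.random.default_rng(2)

def abc_reject(y, prior_sample, simulate, rho, eps, N=1_000):
    # keep draws theta' ~ pi whose pseudo-data land within eps of y
    accepted = []
    while len(accepted) < N:
        theta = prior_sample()
        if rho(simulate(theta), y) < eps:
            accepted.append(theta)
    return np.array(accepted)

# toy usage: N(theta, 1) likelihood, U(-10, 10) prior, |x - y| as distance
post = abc_reject(y=0.0,
                  prior_sample=lambda: rng.uniform(-10, 10),
                  simulate=lambda th: rng.normal(th, 1.0),
                  rho=lambda x, y: abs(x - y),
                  eps=0.1)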
  • 14. ABC Improvements

Simulating from the prior is often poor in efficiency. Either modify the proposal distribution on $\theta$ to increase the density of x's within the vicinity of y...
[Marjoram et al., 2003; Bortot et al., 2007; Sisson et al., 2007]

...or view the problem as conditional density estimation and develop techniques allowing for larger $\epsilon$.
[Beaumont et al., 2002]
  • 15. ABC-MCMC

Markov chain $(\theta^{(t)})$ created via the transition function

$$\theta^{(t+1)} = \begin{cases} \theta' \sim K(\theta'|\theta^{(t)}) & \text{if } x \sim f(x|\theta') \text{ is such that } x = y \\ & \text{and } u \sim \mathcal{U}(0,1) \le \dfrac{\pi(\theta')\,K(\theta^{(t)}|\theta')}{\pi(\theta^{(t)})\,K(\theta'|\theta^{(t)})}\,, \\[1ex] \theta^{(t)} & \text{otherwise,} \end{cases}$$

has the posterior $\pi(\theta|y)$ as stationary distribution.

[Marjoram et al., 2003]
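A Python sketch of this chain, under the simplifying assumption of a symmetric random-walk kernel K (so the K ratio in the acceptance probability equals one); names and defaults are illustrative:

import numpy as np
rng = np.random.default_rng(3)

def abc_mcmc(y, log_prior, simulate, rho, eps, theta0, n_iter=10_000, step=0.5):
    theta, chain = theta0, []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal()  # symmetric K(.|theta)
        # move only if the pseudo-data hit the tolerance AND the
        # Metropolis-Hastings ratio (here, a prior ratio) is passed
        if (rho(simulate(prop), y) < eps
                and np.log(rng.uniform()) <= log_prior(prop) - log_prior(theta)):
            theta = prop
        chain.append(theta)
    return np.array(chain)

chain = abc_mcmc(y=0.0,
                 log_prior=lambda th: 0.0 if -10 < th < 10 else -np.inf,
                 simulate=lambda th: rng.normal(th, 1.0),
                 rho=lambda x, y: abs(x - y),
                 eps=0.1, theta0=0.0)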
  • 16. ABC-PRC

Another sequential version producing a sequence of Markov transition kernels $K_t$ and of samples $(\theta_1^{(t)}, \ldots, \theta_N^{(t)})$ $(1 \le t \le T)$:

ABC-PRC algorithm
1. Pick $\theta^\star$ at random among the previous $\theta_i^{(t-1)}$'s, with probabilities $\omega_i^{(t-1)}$ $(1 \le i \le N)$.
2. Generate $\theta_i^{(t)} \sim K_t(\theta|\theta^\star)$, $x \sim f(x|\theta_i^{(t)})$.
3. Check that $\varrho(x, y) < \epsilon$; otherwise start again.

[Sisson et al., 2007]
  • 17. ABC-PRC Weight

Probability $\omega_i^{(t)}$ computed as

$$\omega_i^{(t)} \propto \pi(\theta_i^{(t)})\, L_{t-1}(\theta^\star|\theta_i^{(t)}) \big/ \big\{\pi(\theta^\star)\, K_t(\theta_i^{(t)}|\theta^\star)\big\}\,,$$

where $L_{t-1}$ is an arbitrary transition kernel. In case

$$L_{t-1}(\theta'|\theta) = K_t(\theta|\theta')\,,$$

all weights are equal under a uniform prior.

Inspired from Del Moral et al. (2006), who use backward kernels $L_{t-1}$ in SMC to achieve unbiasedness.
  • 18. ABC-PRC Bias

Lack of unbiasedness of the method: the joint density of the accepted pair $(\theta^{(t-1)}, \theta^{(t)})$ is proportional to

$$\pi(\theta^{(t-1)}|y)\, K_t(\theta^{(t)}|\theta^{(t-1)})\, f(y|\theta^{(t)})\,.$$

For an arbitrary function $h(\theta)$, $E[\omega_t h(\theta^{(t)})]$ is proportional to

$$\iint h(\theta^{(t)})\, \frac{\pi(\theta^{(t)})\, L_{t-1}(\theta^{(t-1)}|\theta^{(t)})}{\pi(\theta^{(t-1)})\, K_t(\theta^{(t)}|\theta^{(t-1)})}\, \pi(\theta^{(t-1)}|y)\, K_t(\theta^{(t)}|\theta^{(t-1)})\, f(y|\theta^{(t)})\, d\theta^{(t-1)}\, d\theta^{(t)}$$

$$\propto \iint h(\theta^{(t)})\, \frac{\pi(\theta^{(t)})\, L_{t-1}(\theta^{(t-1)}|\theta^{(t)})}{\pi(\theta^{(t-1)})\, K_t(\theta^{(t)}|\theta^{(t-1)})}\, \pi(\theta^{(t-1)})\, f(y|\theta^{(t-1)})\, K_t(\theta^{(t)}|\theta^{(t-1)})\, f(y|\theta^{(t)})\, d\theta^{(t-1)}\, d\theta^{(t)}$$

$$\propto \int h(\theta^{(t)})\, \pi(\theta^{(t)}|y) \left\{ \int L_{t-1}(\theta^{(t-1)}|\theta^{(t)})\, f(y|\theta^{(t-1)})\, d\theta^{(t-1)} \right\} d\theta^{(t)}\,,$$

so the weighted sample is biased unless the inner integral is constant in $\theta^{(t)}$.
  • 19. A Mixture Example

[Figure: two rows of five posterior density plots over $\theta \in (-3, 3)$.] Comparison of $\tau = 0.15$ and $\tau = 1/0.15$ in $K_t$.
  • 20. A PMC Version

Use of the same kernel idea as ABC-PRC but with IS correction. Generate a sample at iteration t by

$$\hat\pi_t(\theta^{(t)}) \propto \sum_{j=1}^{N} \omega_j^{(t-1)}\, K_t(\theta^{(t)}|\theta_j^{(t-1)})$$

modulo acceptance of the associated $x_t$, and use an importance weight associated with an accepted simulation $\theta_i^{(t)}$:

$$\omega_i^{(t)} \propto \pi(\theta_i^{(t)}) \big/ \hat\pi_t(\theta_i^{(t)})\,.$$

Still likelihood-free.

[Beaumont et al., 2008, arXiv:0805.2256]
  • 21. The ABC-PMC Algorithm

Given a decreasing sequence of approximation levels $\epsilon_1 \ge \ldots \ge \epsilon_T$:

1. At iteration t = 1,
   For i = 1, ..., N:
     Simulate $\theta_i^{(1)} \sim \pi(\theta)$ and $x \sim f(x|\theta_i^{(1)})$ until $\varrho(x, y) < \epsilon_1$
     Set $\omega_i^{(1)} = 1/N$
   Take $\tau_2^2$ as twice the empirical variance of the $\theta_i^{(1)}$'s

2. At iteration $2 \le t \le T$,
   For i = 1, ..., N, repeat:
     Pick $\theta_i^\star$ from the $\theta_j^{(t-1)}$'s with probabilities $\omega_j^{(t-1)}$
     Generate $\theta_i^{(t)} | \theta_i^\star \sim \mathcal{N}(\theta_i^\star, \tau_t^2)$ and $x \sim f(x|\theta_i^{(t)})$
   until $\varrho(x, y) < \epsilon_t$
   Set $\omega_i^{(t)} \propto \pi(\theta_i^{(t)}) \Big/ \sum_{j=1}^{N} \omega_j^{(t-1)}\, \varphi\big(\tau_t^{-1}(\theta_i^{(t)} - \theta_j^{(t-1)})\big)$
   Take $\tau_{t+1}^2$ as twice the weighted empirical variance of the $\theta_i^{(t)}$'s
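A direct Python transcription of the algorithm for a scalar parameter; the function names, and the assumption that prior_logpdf is vectorised, are choices of this sketch ($\varphi$ is the standard normal density, and the constant $1/\tau_t$ factor cancels in the normalised weights):

import numpy as np
from scipy import stats
rng = np.random.default_rng(4)

def abc_pmc(y, prior_rvs, prior_logpdf, simulate, rho, eps_seq, N=1_000):
    # iteration t = 1: plain ABC rejection from the prior
    theta = np.array([_hit(prior_rvs, simulate, rho, y, eps_seq[0])
                      for _ in range(N)])
    w = np.full(N, 1.0 / N)
    tau2 = 2.0 * np.var(theta)
    for eps in eps_seq[1:]:
        new = np.empty(N)
        for i in range(N):
            while True:
                star = rng.choice(theta, p=w)            # pick from previous cloud
                cand = rng.normal(star, np.sqrt(tau2))   # N(theta*, tau_t^2) move
                if rho(simulate(cand), y) < eps:
                    new[i] = cand
                    break
        # omega_i ~ pi(theta_i) / sum_j omega_j phi((theta_i - theta_j)/tau_t)
        kern = stats.norm.pdf((new[:, None] - theta[None, :]) / np.sqrt(tau2))
        w_new = np.exp(prior_logpdf(new)) / (kern * w).sum(axis=1)
        theta, w = new, w_new / w_new.sum()
        tau2 = 2.0 * float(np.cov(theta, aweights=w))    # weighted variance, doubled
    return theta, w

def _hit(prior_rvs, simulate, rho, y, eps):
    while True:
        th = prior_rvs()
        if rho(simulate(th), y) < eps:
            return th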
  • 22. A Mixture Example (0)

Toy model of Sisson et al. (2007): if

$$\theta \sim \mathcal{U}(-10, 10)\,, \quad x|\theta \sim 0.5\, \mathcal{N}(\theta, 1) + 0.5\, \mathcal{N}(\theta, 1/100)\,,$$

then the posterior distribution associated with y = 0 is the normal mixture

$$\theta | y = 0 \sim 0.5\, \mathcal{N}(0, 1) + 0.5\, \mathcal{N}(0, 1/100)$$

restricted to $[-10, 10]$. Furthermore, true target available as

$$\pi(\theta \,\big|\, |x| < \epsilon) \propto \Phi(\epsilon - \theta) - \Phi(-\epsilon - \theta) + \Phi(10(\epsilon - \theta)) - \Phi(-10(\epsilon + \theta))\,.$$
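For this toy model, both the simulator and the true tolerance target fit in a few lines of Python, which makes it a convenient benchmark for the ABC-PMC sketch above (all names are illustrative):

import numpy as np
from scipy.stats import norm
rng = np.random.default_rng(5)

def simulate_mixture(theta):
    # one draw from 0.5 N(theta, 1) + 0.5 N(theta, 1/100)
    return rng.normal(theta, 1.0 if rng.uniform() < 0.5 else 0.1)

def true_target(theta, eps):
    # unnormalised pi(theta | |x| < eps), restricted to [-10, 10]
    dens = (norm.cdf(eps - theta) - norm.cdf(-eps - theta)
            + norm.cdf(10 * (eps - theta)) - norm.cdf(-10 * (eps + theta)))
    return np.where(np.abs(theta) <= 10, dens, 0.0)

grid = np.linspace(-3, 3, 601)
dens = true_target(grid, eps=0.1)
dens /= dens.sum() * (grid[1] - grid[0])   # numerical normalisation on the grid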
  • 23. A Mixture Example (2)

Recovery of the target, whether using a fixed standard deviation of $\tau = 0.15$ or $\tau = 1/0.15$, or a sequence of adaptive $\tau_t$'s.
  • 24. ABC for Model Choice

1. Introduction
2. Population Monte Carlo
3. ABC
4. ABC-PMC
5. ABC for model choice in GRFs
   Gibbs random fields
   Model choice via ABC
   Illustrations
  • 25. Gibbs Random Fields

Gibbs distribution: $y = (y_1, \ldots, y_n)$ is a Gibbs random field associated with the graph G if

$$f(y) = \frac{1}{Z} \exp\Big\{ -\sum_{c \in C} V_c(y_c) \Big\}\,,$$

where Z is the normalising constant, C is the set of cliques, and $V_c$ is any function, also called potential ($U(y) = \sum_{c \in C} V_c(y_c)$ is the energy function).

Z is usually unavailable in closed form.
  • 26. Potts Model

$V_c(y)$ is of the form

$$V_c(y) = \theta S(y) = \theta \sum_{l \sim i} \delta_{y_l = y_i}$$

where $l \sim i$ denotes a neighbourhood structure.

In most realistic settings, the summation

$$Z_\theta = \sum_{x \in \mathcal{X}} \exp\{\theta^T S(x)\}$$

involves too many terms to be manageable, and numerical approximations cannot always be trusted.

[Cucala, Marin, CPR & Titterington, 2009]
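Note that the sufficient statistic S(x) is trivial to evaluate even though $Z_\theta$ is not; simulation, by contrast, requires MCMC. A small Python sketch, using a single-site Gibbs sampler on a lattice with a 4-neighbour structure (lattice size, sweep count, and names are illustrative assumptions):

import numpy as np

def potts_stat(x):
    # S(x): number of identical 4-neighbour pairs on the lattice
    return (np.sum(x[1:, :] == x[:-1, :])      # vertical pairs
            + np.sum(x[:, 1:] == x[:, :-1]))   # horizontal pairs

def potts_gibbs(theta, K=2, shape=(16, 16), sweeps=100,
                rng=np.random.default_rng(6)):
    # single-site Gibbs sampler: x_ij | rest ~ exp{theta * #matching neighbours}
    x = rng.integers(K, size=shape)
    n, m = shape
    for _ in range(sweeps):
        for i in range(n):
            for j in range(m):
                nbrs = [x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < n and 0 <= b < m]
                logp = np.array([theta * sum(int(v == k) for v in nbrs)
                                 for k in range(K)], dtype=float)
                p = np.exp(logp - logp.max())
                x[i, j] = rng.choice(K, p=p / p.sum())
    return x

x = potts_gibbs(theta=0.8)
s = potts_stat(x)   # sufficient statistic of the simulated field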
  • 27. Bayesian Model Choice

Comparing a model with potential $S_0$ taking values in $\mathbb{R}^{p_0}$ versus a model with potential $S_1$ taking values in $\mathbb{R}^{p_1}$, through the Bayes factor corresponding to the priors $\pi_0$ and $\pi_1$ on each parameter space:

$$B_{m_0/m_1}(x) = \frac{\int \exp\{\theta_0^T S_0(x)\} \big/ Z_{\theta_0,0}\ \pi_0(d\theta_0)}{\int \exp\{\theta_1^T S_1(x)\} \big/ Z_{\theta_1,1}\ \pi_1(d\theta_1)}$$

Use of Jeffreys' scale to select the most appropriate model.
  • 28. Neighbourhood Relations

Choice to be made between M neighbourhood relations

$$i \overset{m}{\sim} i' \quad (0 \le m \le M-1)$$

with

$$S_m(x) = \sum_{i \overset{m}{\sim} i'} \mathbb{I}\{x_i = x_{i'}\}\,,$$

driven by the posterior probabilities of the models.
  • 29. Model Index

Formalisation via a model index $\mathcal{M}$ that appears as a new parameter, with prior distribution $\pi(\mathcal{M} = m)$ and

$$\pi(\theta | \mathcal{M} = m) = \pi_m(\theta_m)$$

Computational target:

$$P(\mathcal{M} = m | x) \propto \pi(\mathcal{M} = m) \int_{\Theta_m} f_m(x|\theta_m)\, \pi_m(\theta_m)\, d\theta_m$$
  • 30. Sufficient Statistics

By definition, if $S(x)$ is a sufficient statistic for the joint parameters $(\mathcal{M}, \theta_0, \ldots, \theta_{M-1})$,

$$P(\mathcal{M} = m | x) = P(\mathcal{M} = m | S(x))\,.$$

For each model m, own sufficient statistic $S_m(\cdot)$, and $S(\cdot) = (S_0(\cdot), \ldots, S_{M-1}(\cdot))$ also sufficient. For Gibbs random fields,

$$x | \mathcal{M} = m \sim f_m(x|\theta_m) = f_m^1(x|S(x))\, f_m^2(S(x)|\theta_m) = \frac{1}{n(S(x))}\, f_m^2(S(x)|\theta_m)$$

where

$$n(S(x)) = \#\{\tilde x \in \mathcal{X} : S(\tilde x) = S(x)\}\,.$$

$S(x)$ is therefore also sufficient for the joint parameters. [Specific to Gibbs random fields!]
  • 31. ABC Model Choice

Algorithm ABC-MC
  Generate $m^*$ from the prior $\pi(\mathcal{M} = m)$.
  Generate $\theta_{m^*}^*$ from the prior $\pi_{m^*}(\cdot)$.
  Generate $x^*$ from the model $f_{m^*}(\cdot|\theta_{m^*}^*)$.
  Compute the distance $\rho(S(x^0), S(x^*))$.
  Accept $(\theta_{m^*}^*, m^*)$ if $\rho(S(x^0), S(x^*)) < \epsilon$.

Note: when $\epsilon = 0$ the algorithm is exact.
  • 32. ABC Approximation to the Bayes Factor

Frequency ratio:

$$\widehat{BF}_{m_0/m_1}(x^0) = \frac{\hat P(\mathcal{M} = m_0 | x^0)}{\hat P(\mathcal{M} = m_1 | x^0)} \times \frac{\pi(\mathcal{M} = m_1)}{\pi(\mathcal{M} = m_0)} = \frac{\#\{m^{i*} = m_0\}}{\#\{m^{i*} = m_1\}} \times \frac{\pi(\mathcal{M} = m_1)}{\pi(\mathcal{M} = m_0)}\,,$$

replaced with

$$\widehat{BF}_{m_0/m_1}(x^0) = \frac{1 + \#\{m^{i*} = m_0\}}{1 + \#\{m^{i*} = m_1\}} \times \frac{\pi(\mathcal{M} = m_1)}{\pi(\mathcal{M} = m_0)}$$

to avoid indeterminacy (also a Bayes estimate).
  • 33. Toy Example

iid Bernoulli model versus two-state first-order Markov chain, i.e.

$$f_0(x|\theta_0) = \exp\Big( \theta_0 \sum_{i=1}^{n} \mathbb{I}\{x_i = 1\} \Big) \Big/ \{1 + \exp(\theta_0)\}^n\,,$$

versus

$$f_1(x|\theta_1) = \frac{1}{2} \exp\Big( \theta_1 \sum_{i=2}^{n} \mathbb{I}\{x_i = x_{i-1}\} \Big) \Big/ \{1 + \exp(\theta_1)\}^{n-1}\,,$$

with priors $\theta_0 \sim \mathcal{U}(-5, 5)$ and $\theta_1 \sim \mathcal{U}(0, 6)$ (inspired by "phase transition" boundaries).
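This toy example is easy to reproduce; a Python sketch of ABC-MC on it, using $S(x) = (S_0(x), S_1(x))$ and the regularised Bayes-factor estimate of the previous slide (the data-generating choice and all names are illustrative):

import numpy as np
rng = np.random.default_rng(7)

def sim_model0(theta0, n):
    # iid Bernoulli with P(x_i = 1) = exp(theta0) / (1 + exp(theta0))
    return (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-theta0))).astype(int)

def sim_model1(theta1, n):
    # two-state Markov chain with P(x_i = x_{i-1}) = exp(theta1) / (1 + exp(theta1))
    p = 1.0 / (1.0 + np.exp(-theta1))
    x = np.empty(n, dtype=int)
    x[0] = rng.integers(2)
    for i in range(1, n):
        x[i] = x[i - 1] if rng.uniform() < p else 1 - x[i - 1]
    return x

def S(x):
    # concatenated sufficient statistics (S_0(x), S_1(x))
    return np.array([np.sum(x == 1), np.sum(x[1:] == x[:-1])])

def abc_mc_bf(x0, n_prop=100_000, q=0.01):
    s0 = S(x0)
    ms = rng.integers(2, size=n_prop)          # uniform prior on the model index
    dists = np.empty(n_prop)
    for k in range(n_prop):
        th = rng.uniform(-5, 5) if ms[k] == 0 else rng.uniform(0, 6)
        x = sim_model0(th, x0.size) if ms[k] == 0 else sim_model1(th, x0.size)
        dists[k] = np.abs(S(x) - s0).sum()
    keep = dists <= np.quantile(dists, q)      # epsilon = q-quantile of distances
    n0, n1 = np.sum(ms[keep] == 0), np.sum(ms[keep] == 1)
    return (1 + n0) / (1 + n1)                 # prior ratio equals 1 here

bf01 = abc_mc_bf(sim_model1(2.0, 100))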
  • 34. Toy Example (2)

[Figure: two scatterplots of $\widehat{BF}_{m_0/m_1}$ against $BF_{m_0/m_1}$.] (left) Comparison of the true $BF_{m_0/m_1}(x^0)$ with $\widehat{BF}_{m_0/m_1}(x^0)$ (in logs) over 2,000 simulations and $4 \times 10^6$ proposals from the prior. (right) Same when using a tolerance $\epsilon$ corresponding to the 1% quantile on the distances.
  • 35. Protein Folding

[Figure] Superposition of the native structure (grey) with the ST1 structure (red), the ST2 structure (orange), the ST3 structure (green), and the DT structure (blue).
  • 36. Protein Folding (2)

                % seq. Id.   TM-score   FROST score
  1i5nA (ST1)   32           0.86       75.3
  1ls1A1 (ST2)  5            0.42       8.9
  1jr8A (ST3)   4            0.24       8.9
  1s7oA (DT)    10           0.08       7.8

Characteristics of dataset. % seq. Id.: percentage of identity with the query sequence. TM-score: similarity between predicted and native structure (uncertainty between 0.17 and 0.4). FROST score: quality of alignment of the query onto the candidate structure (uncertainty between 7 and 9).
  • 37. Protein Folding (3)

                        NS/ST1   NS/ST2   NS/ST3   NS/DT
  $\widehat{BF}$        1.34     1.22     2.42     2.76
  $P(\mathcal{M} = NS|x^0)$   0.573    0.551    0.708    0.734

Estimates of the Bayes factors between model NS and models ST1, ST2, ST3, and DT, and corresponding posterior probabilities of model NS, based on an ABC-MC algorithm using $1.2 \times 10^6$ simulations and a tolerance $\epsilon$ equal to the 1% quantile of the distances.