Adaptive MCMH for Bayesian inference of spatial autologistic models

discussion of Faming Liang's talk
by Pierre E. Jacob & Christian P. Robert

Université Paris-Dauphine and CREST
http://statisfaction.wordpress.com

Adap'skiii, Park City, Utah, Jan. 03, 2011
Context: intractable likelihoods

Cases where the likelihood function f(y|θ) is unavailable and where the completion step

    f(y|θ) = ∫_Z f(y, z|θ) dz

is impossible or too costly because of the dimension of z: MCMC cannot be implemented!

⇒ adaptive MCMH
Illustrations

Potts model: if y takes values on a grid Y of size k^n and

    f(y|θ) ∝ exp( θ Σ_{l∼i} I_{y_l = y_i} )

where l∼i denotes a neighbourhood relation, a moderately large n prohibits the computation of the normalising constant.
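The combinatorial blow-up of the normalising constant can be made concrete by brute-forcing it on a toy grid (an illustrative sketch, not code from the paper; the 3×3 grid, k = 3 colours and θ = 0.5 are arbitrary choices):

```python
import itertools
import math

def unnormalised_potts(y, theta, neighbours):
    # exp(theta * number of neighbouring pairs (l, i) with y_l == y_i)
    s = sum(1 for (l, i) in neighbours if y[l] == y[i])
    return math.exp(theta * s)

def grid_neighbours(rows, cols):
    # horizontal and vertical neighbour pairs of a rows x cols lattice,
    # sites indexed row-major
    pairs = []
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:
                pairs.append((i, i + 1))
            if r + 1 < rows:
                pairs.append((i, i + cols))
    return pairs

# The normalising constant Z(theta) sums over all k^n colourings:
# already 3^9 = 19683 terms for a tiny 3x3 grid with 3 colours,
# and utterly infeasible for realistic lattice sizes.
k, rows, cols = 3, 3, 3
nbrs = grid_neighbours(rows, cols)
Z = sum(unnormalised_potts(y, 0.5, nbrs)
        for y in itertools.product(range(k), repeat=rows * cols))
print(f"{k ** (rows * cols)} terms, Z = {Z:.2f}")
```

Doubling the side length of the grid raises the term count from 3^9 to 3^36, which is why the likelihood is only known up to Z(θ).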
Compare ABC with adaptive MCMH?

ABC algorithm
For an observation y ∼ f(y|θ), under the prior π(θ), keep jointly simulating

    θ′ ∼ π(θ),   z ∼ f(z|θ′),

until the auxiliary variable z is equal to the observed value, z = y.

                                                [Tavaré et al., 1997]

ABC can also be applied when the normalizing constant is unknown, hence it could be compared with adaptive MCMH, e.g. for a given number of y's drawn from f, or in terms of execution time.
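The exact-match ABC step above fits in a few lines (a toy illustration under a made-up discrete model, not the autologistic setting; the Bernoulli example and all names are hypothetical):

```python
import random

def abc_exact(y_obs, prior_sample, simulate, max_iter=100_000):
    # Exact-match ABC (Tavaré et al., 1997): accept a prior draw theta'
    # only when the simulated pseudo-data equals the observation exactly.
    for _ in range(max_iter):
        theta = prior_sample()
        z = simulate(theta)
        if z == y_obs:
            return theta
    raise RuntimeError("no acceptance within max_iter draws")

# Toy discrete example: theta ~ Uniform{0.2, 0.8},
# y | theta = three Bernoulli(theta) flips.
random.seed(1)
prior = lambda: random.choice([0.2, 0.8])
sim = lambda th: tuple(int(random.random() < th) for _ in range(3))
draw = abc_exact((1, 1, 1), prior, sim)
print(draw)
```

Exact matching is only viable for small discrete data such as this; for large grids one matches summary statistics within a tolerance instead, which is where the comparison with adaptive MCMH in terms of simulation budget becomes relevant.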
Some questions on the estimator R̂(θ_t, ϑ)

    R̂(θ_t, ϑ) = [ m_0 + m_0 Σ_{θ_i ∈ S_t∖{θ_t}} I(‖θ_i − ϑ‖ ≤ η) ]⁻¹
                × [ Σ_{θ_i ∈ S_t∖{θ_t}} I(‖θ_i − ϑ‖ ≤ η) Σ_{j=1}^{m_0} g(z_j^{(i)}, θ_t) / g(z_j^{(i)}, ϑ)
                    + Σ_{j=1}^{m_0} g(z_j^{(t)}, θ_t) / g(z_j^{(t)}, ϑ) ]

- Why does it bypass the number of repetitions of the θ_i's?
- Why resample the y_j^{(i)}'s only to inverse-weight them later? Why not stick to the original sample and weight the y_j^{(i)}'s directly? This would avoid the necessity to choose m_0.
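For concreteness, here is one possible reading of that estimator as a direct transcription of the display (a sketch only: the set S_t, the stored samples z_j^{(i)} and the toy unnormalised density g(z, θ) = exp(θz) are hypothetical stand-ins, not Liang's implementation):

```python
import math

def r_hat(theta_t, vartheta, S_t, z_store, g, m0, eta):
    # One reading of the displayed estimator: average the importance ratios
    # g(z_j, theta_t) / g(z_j, vartheta) over the m0 samples stored for
    # theta_t and for every past theta_i within distance eta of vartheta.
    # (Float equality to single out theta_t is a toy shortcut.)
    near = [i for i, th in enumerate(S_t)
            if th != theta_t and abs(th - vartheta) <= eta]
    total = sum(g(z, theta_t) / g(z, vartheta)
                for i in near for z in z_store[i][:m0])
    t = S_t.index(theta_t)
    total += sum(g(z, theta_t) / g(z, vartheta) for z in z_store[t][:m0])
    return total / (m0 + m0 * len(near))

# Toy usage with a hypothetical unnormalised density g(z, θ) = exp(θ z)
g = lambda z, th: math.exp(th * z)
S_t = [0.0, 0.4, 1.0]                      # past θ_i's; current θ_t = 1.0
z_store = [[0.1, -0.2], [0.3, 0.0], [0.5, 0.2]]
val = r_hat(1.0, 0.9, S_t, z_store, g, m0=2, eta=0.6)
print(round(val, 3))  # → 1.025
```

Written this way, the first question above is visible in the code: `near` only records *whether* a past θ_i falls within η of ϑ, not how many times that value was visited by the chain.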
Combine adaptations?

Adaptive MCMH looks great, but it adds new tuning parameters compared with standard MCMC!

- It would be nice to be able to adapt the proposal Q(θ_t, ϑ), e.g. to control the acceptance rate, as in other adaptive MCMC algorithms.
- Could η then be adaptive as well, since it also depends on the (unknown) scale of the posterior distribution of θ?
- How would the current theoretical framework cope with these additional adaptations?
- In the same spirit, could m (and m_0) be considered as algorithmic parameters and be adapted as well? What criterion would be used to adapt them?
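The first bullet, adapting the proposal to control the acceptance rate, is standard in adaptive MCMC; a minimal Robbins–Monro sketch on a toy Gaussian target (illustrative only; the target rate 0.234 and the step-size schedule are conventional but arbitrary choices, unrelated to the MCMH estimator):

```python
import math
import random

def adapt_scale_rw(logpi, theta0, n_iter=5000, target=0.234):
    # Random-walk Metropolis with Robbins-Monro adaptation of the proposal
    # scale: nudge log(sigma) up after acceptances and down after rejections
    # so the running acceptance rate is pushed toward the target.
    theta, log_sigma = theta0, 0.0
    accepts = 0.0
    for t in range(1, n_iter + 1):
        prop = theta + math.exp(log_sigma) * random.gauss(0, 1)
        if math.log(random.random()) < logpi(prop) - logpi(theta):
            theta, acc = prop, 1.0
        else:
            acc = 0.0
        accepts += acc
        # diminishing-adaptation step, so the adaptation vanishes as t grows
        log_sigma += t ** -0.6 * (acc - target)
    return theta, math.exp(log_sigma), accepts / n_iter

random.seed(0)
_, sigma, rate = adapt_scale_rw(lambda x: -0.5 * x * x, 0.0)
print(f"final scale {sigma:.2f}, acceptance rate {rate:.2f}")
```

The same stochastic-approximation recipe could in principle target η, m or m_0, which is exactly where the question of a suitable adaptation criterion, and of the theory covering the combined adaptations, arises.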
Sample size and acceptance rate

- Another MH algorithm where the "true" acceptance ratio is replaced by a ratio involving an unbiased estimator is the particle MCMC (PMCMC) algorithm.
- In PMCMC, the acceptance rate grows with the number of particles used to estimate the likelihood, i.e. when the estimation becomes more precise, the acceptance rate increases and converges towards that of an idealized standard MH algorithm.
- In adaptive MCMH, is there such a link between m, m_0 and the number of iterations on one side, and the acceptance rate of the algorithm on the other? Should the rate grow as the estimation becomes more and more precise?
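That PMCMC behaviour, the acceptance rate improving as the likelihood estimator becomes more precise, can be reproduced on a toy pseudo-marginal chain (a sketch with a made-up noisy likelihood, not a particle filter; the Gaussian target and noise model are arbitrary):

```python
import math
import random

def pseudo_marginal_rate(n_samples, n_iter=4000):
    # Pseudo-marginal MH on a toy N(0,1) target: the likelihood is replaced
    # by an unbiased but noisy estimate, whose noise shrinks as n_samples
    # grows; noisier estimates make the chain stick at lucky overestimates,
    # depressing the acceptance rate.
    def noisy_lik(theta):
        # unbiased multiplicative noise: E[exp(xi - 0.5 s^2)] = 1, xi ~ N(0, s^2)
        s = 1.0
        w = sum(math.exp(random.gauss(0, s) - 0.5 * s * s)
                for _ in range(n_samples)) / n_samples
        return w * math.exp(-0.5 * theta * theta)
    theta, lik = 0.0, noisy_lik(0.0)
    accepts = 0
    for _ in range(n_iter):
        prop = theta + random.gauss(0, 1)
        lik_prop = noisy_lik(prop)
        if random.random() < lik_prop / lik:
            theta, lik = prop, lik_prop
            accepts += 1
    return accepts / n_iter

random.seed(2)
r1, r100 = pseudo_marginal_rate(1), pseudo_marginal_rate(100)
print(f"acceptance with 1 sample: {r1:.2f}, with 100 samples: {r100:.2f}")
```

Whether the MCMH acceptance rate obeys the same monotone link with m and m_0 is precisely the open question of this slide.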

