Bayesian Phylogenetic
Inference: Introduction
Fred(rik) Ronquist
Swedish Museum of Natural History,
Stockholm, Sweden
BIG 4 Workshop, October 9-19, Tovetorp, Sweden
Topics
 Probability 101
 Bayesian Phylogenetic Inference
 Markov chain Monte Carlo
 Bayesian Model Choice and Model Averaging
1. Probability 101
Do not worry about your
difficulties in Mathematics. I
can assure you that mine
are still greater.
Albert Einstein
Continuous probability distributions
[Figure: example density plots of the uniform, beta, gamma, Dirichlet and exponential distributions]
Discrete Probability
Sample space: ω1, ω2, ω3, ω4, ω5, ω6
Event (a subset of outcomes, e.g., a die face showing an odd number): E = {ω1, ω3, ω5}
Distribution function (e.g., uniform): m(ω)
Probability: Pr(E) = ∑_{ω∈E} m(ω)
Random variable X
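A minimal numeric sketch of this definition, assuming a fair six-sided die so that m(ω) = 1/6 for every outcome:

```python
# Minimal sketch: probability of an event as a sum over outcomes (fair die assumed).
sample_space = [1, 2, 3, 4, 5, 6]
m = {omega: 1 / 6 for omega in sample_space}   # uniform distribution function m(ω)

event = {1, 3, 5}                              # E = faces showing an odd number
pr_event = sum(m[omega] for omega in event)    # Pr(E) = Σ_{ω∈E} m(ω)
print(pr_event)                                # 0.5
```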
Continuous Probability
Sample space: an interval (e.g. a disc with circumference 1)
Probability density function (pdf), e.g. Uniform(0,1): f(x)
Event (a subspace of the sample space): E = [a, b)
Probability: Pr(E) = ∫_{x∈E} f(x) dx
Random variable X
Probability Distributions
• Uniform distribution
• Beta distribution
• Gamma distribution
• Dirichlet distribution
• Exponential distribution
• Normal distribution
• Lognormal distribution
• Multivariate normal distribution
Continuous Distributions
Probability Distributions
• Bernoulli distribution
• Categorical distribution
• Binomial distribution
• Multinomial distribution
• Poisson distribution
Discrete Distributions
Stochastic Processes
• Markov chain
• Poisson process
• Birth-death process
• Coalescent
• Dirichlet Process Mixture
Stochastic Processes
Exponential Distribution
X ~ Exp(λ)
Parameters: λ = rate (of decay)
Probability density function: f(x) = λe^(−λx)
Mean: 1/λ
[Figure: the exponential density]
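A quick numeric check of the mean, drawing from Exp(λ) with NumPy (which parameterizes the distribution by the scale 1/λ):

```python
import numpy as np

# Minimal sketch: draw from Exp(λ) and check that the sample mean is close to 1/λ.
rng = np.random.default_rng(1)
lam = 2.0                                          # rate parameter λ (arbitrary choice)
x = rng.exponential(scale=1 / lam, size=100_000)   # NumPy uses scale = 1/λ
print(x.mean(), 1 / lam)                           # both ≈ 0.5
```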
Gamma Distribution
X ~ Gamma(α, β)
Parameters: α = shape, β = inverse scale
Probability density function: f(x) ∝ x^(α−1) e^(−βx)
Mean: α/β
Scaled gamma: α = β (so the mean is 1)
[Figures: the gamma density and the scaled gamma]
Beta Distribution
X ~ Beta(α1, α2)
Parameters: α1, α2 = shape parameters
Probability density function: f(x) ∝ x^(α1−1) (1−x)^(α2−1)
Mode: (α1 − 1) / ∑_i (αi − 1)
Defined on two proportions of a whole (a simplex)
[Figure: the beta density]
Dirichlet Distribution
X ~ Dir(α), α = {α1, α2, ..., αk}
Parameters: α = vector of k shape parameters
Probability density function: f(x) ∝ ∏_i x_i^(αi−1)
Defined on k proportions of a whole
[Figures: Dir(1,1,1,1) is flat over the simplex; Dir(300,300,300,300) is tightly concentrated around equal proportions]
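A minimal sketch of the effect of the shape parameters, using NumPy's Dirichlet sampler:

```python
import numpy as np

# Minimal sketch: draws from a flat versus a concentrated Dirichlet distribution.
rng = np.random.default_rng(2)
flat = rng.dirichlet([1, 1, 1, 1], size=5)            # Dir(1,1,1,1): anything goes
tight = rng.dirichlet([300, 300, 300, 300], size=5)   # Dir(300,...): near (0.25, 0.25, 0.25, 0.25)
print(flat.round(2))
print(tight.round(2))
```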
Conditional Probability
Pr(H) = 4/10 = 0.4
Pr(D) = 3/10 = 0.3
Joint probability: Pr(D, H) = 2/10 = 0.2
Conditional probability: Pr(D | H) = 2/4 = 0.5
Bayes’ Rule
Reverend Thomas Bayes (1701–1761)
Pr(A | B) ⇒ Pr(B | A) ?
Bayes’ Rule
Pr(D, H) = Pr(D) Pr(H | D) = Pr(H) Pr(D | H)
Pr(D) Pr(H | D) = Pr(H) Pr(D | H)
Bayes’ rule: Pr(H | D) = Pr(H) Pr(D | H) / Pr(D)
Check with the example above:
Pr(D) Pr(H | D) = 3/10 × 2/3 = 2/10 = 0.2
Pr(H) Pr(D | H) = 4/10 × 2/4 = 2/10 = 0.2
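A numeric check of the same identities with exact fractions:

```python
from fractions import Fraction

# Minimal sketch: Bayes' rule on the 10-object example above.
pr_H = Fraction(4, 10)           # Pr(H)
pr_D = Fraction(3, 10)           # Pr(D)
pr_D_given_H = Fraction(2, 4)    # Pr(D | H)

# Bayes' rule: Pr(H | D) = Pr(H) Pr(D | H) / Pr(D)
pr_H_given_D = pr_H * pr_D_given_H / pr_D
print(pr_H_given_D)              # 2/3
print(pr_D * pr_H_given_D)       # joint probability Pr(D, H) = 1/5 = 0.2
```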
Maximum Likelihood Inference
Data D; model M with parameters θ
We can calculate Pr(D | θ) or f(D | θ)
Define the likelihood function L(θ) ∝ f(D | θ)
Maximum likelihood: find the value of θ that maximizes L(θ)
Confidence: asymptotic behavior, more samples, bootstrapping
Bayesian Inference
Data D; model M with parameters θ
We can calculate Pr(D | θ) or f(D | θ)
We are actually interested in Pr(θ | D) or f(θ | D)
Bayes’ rule: f(θ | D) = f(θ) f(D | θ) / f(D)
  posterior density = prior density × ”likelihood” / normalizing constant
Normalizing constant = marginal likelihood of the data = model likelihood:
f(D) = ∫ f(θ) f(D | θ) dθ
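A minimal grid approximation of these quantities for a toy model (a coin with unknown heads probability θ, a flat prior and 7 heads in 10 tosses; the example is illustrative, not from the slides):

```python
import numpy as np

# Minimal sketch: prior, likelihood, marginal likelihood and posterior on a grid.
theta = np.linspace(0, 1, 1001)
d_theta = theta[1] - theta[0]
prior = np.ones_like(theta)                               # f(θ): flat prior density
likelihood = theta**7 * (1 - theta)**3                    # f(D | θ) up to a constant
marginal = np.sum(prior * likelihood) * d_theta           # f(D) = ∫ f(θ) f(D | θ) dθ
posterior = prior * likelihood / marginal                 # f(θ | D)
print(theta[np.argmax(posterior)])                        # posterior mode ≈ 0.7
```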
Coin Tossing Example
What is the probability of
your favorite team winning
the next ice hockey World
Championships?
World Championship Medalists
[Table: gold, silver and bronze medalists by year, 2007–2016, with total medal counts per team: 2, 5, 8, 1, 5, 5, 3, 1, plus ”other”]
Bayesian updating of the prior (teams as in the medal table; values are unnormalized weights):

Team:                 1    2    3    4    5    6    7    8    other
Prior f(θ):           2    5    8    1    5    5    3    1    ...
Data 1:               in   out  in   out  out  in   in   out  ...
Posterior 1 f(θ|D1):  2    0    8    0    0    5    3    0    ...
Data 2:               out  out  out  out  out  won  out  out  ...
Posterior 2 f(θ|D2):  0    0    0    0    0    5    0    0    ...

f(θ) → f(θ | D1 + D2)
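A minimal sketch of this sequential updating, treating each season's result as a simple 0/1 likelihood (an illustrative reading of the slide, not a literal transcription):

```python
import numpy as np

# Prior weights are the medal counts; each data set multiplies in an indicator likelihood.
prior = np.array([2, 5, 8, 1, 5, 5, 3, 1], dtype=float)   # medal counts per team
in_final_1 = np.array([1, 0, 1, 0, 0, 1, 1, 0])           # Data 1: in/out
won_2 = np.array([0, 0, 0, 0, 0, 1, 0, 0])                # Data 2: only one team won

posterior1 = prior * in_final_1
posterior2 = posterior1 * won_2
print(posterior1 / posterior1.sum())   # f(θ | D1), normalized
print(posterior2 / posterior2.sum())   # f(θ | D1 + D2): all mass on the winning team
```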
Learn more:
• Wikipedia (good texts on most statistical distributions,
sometimes a little difficult)
• Grinstead & Snell: Introduction to Probability.
American Mathematical Society. Free pdf available
from:
http://www.dartmouth.edu/~chance/teaching_aids/books_art
• Team up with a statistician or a computational /
theoretical evolutionary biologist!
2. Bayesian Phylogenetic
Inference
Infer relationships among three species (A, B, C) plus an outgroup:
Three possible trees (topologies):
[Figure: the three alternative topologies for A, B and C]
[Figure: a prior distribution over the three trees (probabilities summing to 1.0) is combined with the data (observations) and the model to yield the posterior distribution over the trees (probabilities summing to 1.0)]
D The data
Taxon Characters
A ACG TTA TTA AAT TGT CCT CTT TTC AGA
B ACG TGT TTC GAT CGT CCT CTT TTC AGA
C ACG TGT TTA GAC CGA CCT CGG TTA AGG
D ACA GGA TTA GAT CGT CCG CTT TTC AGA
Model: topology AND branch lengths
Parameters: θ = (τ, v)
  topology (τ)
  branch lengths (v_i), the expected amount of change
[Figure: unrooted tree for taxa A, B, C, D with branch lengths v1–v5]
Model: molecular evolution
Parameters θ: instantaneous rate matrix (Jukes–Cantor)

          [A]   [C]   [G]   [T]
    [A] (  −     µ     µ     µ  )
Q = [C] (  µ     −     µ     µ  )
    [G] (  µ     µ     −     µ  )
    [T] (  µ     µ     µ     −  )
Model: molecular evolution
Probabilities are calculated using the transition probability matrix P:
P(v) = e^(Qv)
Under Jukes–Cantor the entries are
  1/4 + (3/4) e^(−4v/3)   (no change)
  1/4 − (1/4) e^(−4v/3)   (change)
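A minimal sketch checking the closed-form Jukes–Cantor entries against the matrix exponential (the rate matrix is scaled so that v is the expected number of substitutions per site):

```python
import numpy as np
from scipy.linalg import expm

# Jukes-Cantor rate matrix with total rate 1 per site (off-diagonal rate µ = 1/3).
mu = 1 / 3
Q = np.full((4, 4), mu)
np.fill_diagonal(Q, -3 * mu)

v = 0.5                                   # branch length (expected substitutions per site)
P = expm(Q * v)                           # P(v) = e^(Qv)
print(P[0, 0], 0.25 + 0.75 * np.exp(-4 * v / 3))   # no change: entries agree
print(P[0, 1], 0.25 - 0.25 * np.exp(-4 * v / 3))   # change: entries agree
```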
Priors on parameters
 Topology
 all unique topologies have equal
probability
 Branch lengths
 exponential prior (puts more weight on
small branch lengths)
Scale matters in priors
Exponential prior on a branch length: p(v) = k e^(−kv), e.g. v ~ Exp(1)
Probability of substitution along a branch of length v (Jukes–Cantor): 3/4 (1 − e^(−4v/3))
[Figures: the prior on the branch length and the induced prior on the probability of substitution]
The effect on the data likelihood is most important
Jeffreys’ uninformative priors formalize this
Bayes’ theorem

f(θ | D) = f(θ) f(D | θ) / ∫ f(θ) f(D | θ) dθ

Posterior distribution = prior distribution × ”likelihood” / normalizing constant
D = data, θ = model parameters
Posterior probability distribution
[Figure: the posterior density f(θ | X) over the parameter space, with regions corresponding to tree 1, tree 2 and tree 3; x-axis: parameter space, y-axis: posterior probability]
tree 1 tree 2 tree 3
20% 48% 32%
We can focus on any parameter of interest
(there are no nuisance parameters) by
marginalizing the posterior over the other
parameters (integrating out the
uncertainty in the other parameters)
(Percentages denote marginal probability distribution on trees)
Why is it called marginalizing?

                       tree 1   tree 2   tree 3
  branch lengths ν1     0.05     0.19     0.14     0.38
  branch lengths ν2     0.05     0.22     0.06     0.33
  branch lengths ν3     0.10     0.07     0.12     0.29
                        0.20     0.48     0.32

The cell entries are joint probabilities of a tree (column) and a branch length vector (row); the row and column sums in the margins are the marginal probabilities.
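A one-line check that the marginal probabilities are just row and column sums of the joint table:

```python
import numpy as np

# Joint probabilities from the table above (rows: branch length vectors, columns: trees).
joint = np.array([[0.05, 0.19, 0.14],
                  [0.05, 0.22, 0.06],
                  [0.10, 0.07, 0.12]])
print(joint.sum(axis=0))   # marginal over trees: [0.2  0.48 0.32]
print(joint.sum(axis=1))   # marginal over branch length vectors: [0.38 0.33 0.29]
```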
Markov chain Monte Carlo
 Start at an arbitrary point
 Make a small random move
 Calculate the height ratio r of the new state to the old state:
 r > 1 -> the new state is accepted (always accept)
 r < 1 -> the new state is accepted with probability r (accept sometimes); if the new state is not accepted, stay in the old state
 Go to step 2
The proportion of time the MCMC procedure samples from a particular parameter region is an estimate of that region’s posterior probability density.
[Figure: the posterior surface over tree 1, tree 2 and tree 3 (20 %, 48 %, 32 %), with the starting point (1) and subsequent moves (2a, 2b)]
Metropolis algorithm
Assume that the current state has parameter values θ
Consider a move to a state with parameter values θ*
The height ratio r is (prior ratio × likelihood ratio):

r = f(θ* | D) / f(θ | D)
  = [f(θ*) f(D | θ*) / f(D)] / [f(θ) f(D | θ) / f(D)]
  = [f(θ*) / f(θ)] × [f(D | θ*) / f(D | θ)]
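A minimal Metropolis sampler for an illustrative one-dimensional target (a standard normal density known only up to a constant; symmetric proposals, so no proposal ratio is needed):

```python
import numpy as np

rng = np.random.default_rng(3)

def unnormalized_posterior(theta):
    return np.exp(-0.5 * theta**2)        # stands in for f(θ) f(D | θ)

theta = 5.0                               # arbitrary starting point
samples = []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.5)     # small random move
    r = unnormalized_posterior(proposal) / unnormalized_posterior(theta)  # height ratio
    if r >= 1 or rng.random() < r:               # always accept uphill, sometimes downhill
        theta = proposal
    samples.append(theta)

print(np.mean(samples[5000:]), np.var(samples[5000:]))   # ≈ 0 and ≈ 1 after burn-in
```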
MCMC Sampling Strategies
 Great freedom of strategies:
 Typically one or a few related parameters
changed at a time
 You can cycle through parameters
systematically or choose randomly
 One ”generation”, ”iteration” or ”cycle” can include a single randomly chosen proposal (or move, operator, kernel), one proposal for each parameter, or a block of randomly chosen proposals
Trace Plot
[Figure: trace plot showing the burn-in phase followed by the stationary phase, which is sampled with thinning (rapid mixing essential)]
Majority rule consensus tree
Frequencies represent the posterior probability of the clades: the probability of the clade being true given data and model
[Figure: majority rule consensus tree with clade frequencies]
Summarizing Trees
 Maximum posterior probability tree (MAP tree)
 can be difficult to estimate precisely
 can have low probability
 Majority rule consensus tree
 easier to estimate clade probabilities exactly
 branch length distributions can be summarized across all
trees with the branch
 can hide complex topological dependence
 branch length distributions can be multimodal
 Credible sets of trees
 Include trees in order of decreasing probability to
obtain, e.g., 95 % credible set
 “Median” or “central” tree
Adding Model Complexity
Parameters: topology (τ), branch lengths (v_i), and a General Time Reversible (GTR) substitution model with instantaneous rate matrix

          [A]        [C]        [G]        [T]
    [A] (   −       r_AC π_C   r_AG π_G   r_AT π_T )
Q = [C] ( r_AC π_A     −       r_CG π_G   r_CT π_T )
    [G] ( r_AG π_A   r_CG π_C     −       r_GT π_T )
    [T] ( r_AT π_A   r_CT π_C   r_GT π_G     −     )

[Figure: unrooted tree for taxa A, B, C, D with branch lengths v1–v5]
Adding Model Complexity
Gamma-shaped
rate variation
across sites
Priors on Parameters
 Stationary state frequencies
 Flat Dirichlet, Dir(1,1,1,1)
 Exchangeability parameters
 Flat Dirichlet, Dir(1,1,1,1,1,1)
 Shape parameter of scaled gamma
distribution of rate variation across
sites
 Uniform Uni(0,50)
Mean and 95%
credibility
interval for model
parameters
Summarizing Variables
 Mean, median, variance common
summaries
 95 % credible interval: discard the
lowest 2.5 % and highest 2.5 % of
sampled values
 95 % region of highest posterior
density (HPD): find smallest region
containing 95 % of probability
Credible intervals and HPDs
[Figure: a posterior density with the 95 % credible interval and the 95 % HPD region; for a multimodal density the HPD can consist of several disjoint intervals]
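A minimal sketch computing both summaries from posterior samples (an illustrative gamma-shaped posterior; the HPD search assumes a unimodal posterior):

```python
import numpy as np

rng = np.random.default_rng(4)
samples = rng.gamma(shape=2.0, scale=1.0, size=50_000)   # illustrative posterior samples

# 95 % equal-tailed credible interval: drop the lowest and highest 2.5 % of sampled values.
ci = np.percentile(samples, [2.5, 97.5])

# 95 % HPD interval: the shortest interval containing 95 % of the samples.
s = np.sort(samples)
k = int(np.ceil(0.95 * len(s)))                          # number of samples to include
widths = s[k - 1:] - s[:len(s) - k + 1]
i = np.argmin(widths)
hpd = (s[i], s[i + k - 1])

print(ci, hpd)   # the HPD is shorter and shifted toward the mode
```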
Other Sampling Methods
 Gibbs sampling: sample from the conditional posterior (a variant of
the Metropolis algorithm)
 Metropolized Gibbs sampling: more efficient variant of Gibbs
sampling of discrete characters
 Slice sampling: less prone to get stuck in local optima than the
Metropolis algorithm
 Hamiltonian sampling: a technique for reducing the difficulty of sampling correlated parameters
 Simulated annealing: increase ”greediness” during the burn-in
phase of MCMC sampling
 Data augmentation techniques: add parameters to facilitate
probability calculations
 Sequential Monte Carlo techniques: generate a sample of complete states by building up sets of particles from incomplete states
Sequential Monte Carlo Algorithm for Phylogenetics
Bouchard et al. 2012. Syst. Biol.
3. Markov chain Monte
Carlo
Convergence and Mixing
 Convergence is the degree to which
the chain has converged onto the
target distribution
 Mixing is the speed with which the
chain covers the region of interest in
the target distribution
Assessing Convergence
 Plateau in the trace plot
 Look at sampling behavior within the
run (autocorrelation time, effective
sample size)
 Compare independent runs with
different, randomly chosen starting
points
Convergence within Run
Autocorrelation time: t = 1 + 2 ∑_{k=1}^{∞} ρ_k(θ)
where ρ_k(θ) is the autocorrelation in the MCMC samples for a lag of k generations
Effective sample size (ESS): e = n / t
where n is the total sample size (number of generations)
Good mixing when t is small and e is large
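A minimal sketch of these two quantities from a single chain, truncating the autocorrelation sum at the first non-positive value (one common heuristic; programs differ in the details):

```python
import numpy as np

def ess(chain):
    x = np.asarray(chain) - np.mean(chain)
    n = len(x)
    acf = np.correlate(x, x, mode="full")[n - 1:] / (np.var(chain) * n)  # ρ_k for k = 0, 1, ...
    t = 1.0
    for rho in acf[1:]:
        if rho <= 0:
            break
        t += 2 * rho                    # t = 1 + 2 Σ ρ_k
    return n / t                        # e = n / t

# Example: a strongly autocorrelated AR(1) chain has an ESS far below its length.
rng = np.random.default_rng(5)
chain = np.zeros(5_000)
for i in range(1, len(chain)):
    chain[i] = 0.9 * chain[i - 1] + rng.normal()
print(len(chain), round(ess(chain)))
```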
Convergence among Runs
 Tree topology:
 Compare clade probabilities (split frequencies)
 Average standard deviation of split frequencies
above some cut-off (min. 10 % in at least one
run). Should go to 0 as runs converge.
 Continuous variables
 Potential scale reduction factor (PSRF): compares variance within and between runs; should approach 1 as runs converge (a minimal sketch follows below)
 Assumes overdispersed starting points
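A minimal PSRF sketch for one scalar parameter across several runs (a basic Gelman–Rubin-style calculation; production implementations differ in details such as chain splitting):

```python
import numpy as np

def psrf(runs):
    runs = np.asarray(runs)                                  # shape: (m runs, n samples)
    m, n = runs.shape
    within = runs.var(axis=1, ddof=1).mean()                 # W: mean within-run variance
    between = n * runs.mean(axis=1).var(ddof=1)              # B: between-run variance
    var_hat = (n - 1) / n * within + between / n             # pooled variance estimate
    return np.sqrt(var_hat / within)

rng = np.random.default_rng(6)
converged = rng.normal(size=(4, 5000))                       # four runs sampling the same target
print(psrf(converged))                                       # ≈ 1
stuck = converged + np.array([[0.0], [0.0], [0.0], [3.0]])   # one run stuck elsewhere
print(psrf(stuck))                                           # clearly > 1
```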
[Figures: clade probability in analysis 1 plotted against clade probability in analysis 2, for a converged and a non-converged pair of runs]
[Figure: sampled values against the target distribution for three proposal sizes]
Too modest proposals: acceptance rate too high, poor mixing
Too bold proposals: acceptance rate too low, poor mixing
Moderately bold proposals: intermediate acceptance rate, good mixing
Tuning Proposals
 Manually by changing tuning parameters
 Increase the boldness of a proposal if
acceptance rate is too high
 Decrease the boldness of a proposal if
acceptance rate is too low
 Auto-tuning
 Tuning parameters are adjusted automatically
by the MCMC procedure to reach a target
proposal rate
Metropolis-coupled Markov chain Monte Carlo, a.k.a. MCMCMC, a.k.a. (MC)³
[Figures: a cold chain and one or more heated chains run in parallel; the heated chains move more freely across valleys in the posterior surface, and states are exchanged between chains through successful and unsuccessful swap proposals]
Incremental Heating
T = 1 / (1 + λi),  i = 0, 1, ..., n−1
T is temperature, λ is the heating coefficient
Example for λ = 0.2:

  i   T_i    Distribution
  0   1.00   f(θ | X)^1.00   (cold chain)
  1   0.83   f(θ | X)^0.83
  2   0.71   f(θ | X)^0.71   (heated chains)
  3   0.62   f(θ | X)^0.62
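A quick check of the temperatures in the example above:

```python
import numpy as np

# Incremental heating: T_i = 1 / (1 + λ i) for chains i = 0, ..., n-1 with λ = 0.2.
lam, n_chains = 0.2, 4
temps = 1 / (1 + lam * np.arange(n_chains))
print(temps.round(2))        # [1.   0.83 0.71 0.62]
# Chain i samples f(θ | X) ** temps[i]; chain 0 (T = 1) is the cold chain that is reported.
```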
4. Bayesian Model Choice
Bayesian Model Sensitivity
Models, models, models
 Alignment-free models
 Heterogeneity in substitution rates and stationary
frequencies across sites and lineages
 Relaxed clock models
 Models for morphology and biogeography
 Sampling across model space, e.g. GTR space and
partition space
 Models of dependence across sites according to
3D structure of proteins
 Positive selection models
 Amino acid models
 Models for population genetics and phylogeography
Bayes’ Rule

f(θ | D) = f(θ) f(D | θ) / ∫ f(θ) f(D | θ) dθ = f(θ) f(D | θ) / f(D)

f(D) is the marginal likelihood (of the data).
We have implicitly conditioned on a model:

f(θ | D, M) = f(θ | M) f(D | θ, M) / f(D | M)
Bayesian Model Choice
Posterior model odds: [f(M1) f(D | M1)] / [f(M0) f(D | M0)]
Bayes factor: B10 = f(D | M1) / f(D | M0)
Bayesian Model Choice
 The normalizing constant in Bayes’ theorem, the
marginal likelihood of the data, f(D) or f(D|M), can
be used for model choice
 f(D|M) can be estimated by taking the harmonic mean of the likelihood values from the MCMC run (a sketch follows below); thermodynamic integration and stepping-stone sampling are computationally more complex but more accurate methods
 Any models can be compared: nested, non-nested,
data-derived; it is just a probability comparison
 No correction for number of parameters
 Can prefer a simpler model over a more complex
model
 Critical values in Kass and Raftery (1995)
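A minimal sketch of the harmonic mean estimator mentioned above, on the log scale (illustrative stand-in log-likelihoods; the estimator is known to be unstable, which is why the more expensive methods are preferred):

```python
import numpy as np
from scipy.special import logsumexp

def log_harmonic_mean(log_likelihoods):
    ll = np.asarray(log_likelihoods)
    # 1 / f(D|M) ≈ (1/n) Σ 1 / f(D | θ_i), with θ_i sampled from the posterior
    return -(logsumexp(-ll) - np.log(len(ll)))

rng = np.random.default_rng(7)
fake_ll = rng.normal(loc=-1200.0, scale=3.0, size=10_000)   # stand-in for sampled log-likelihoods
print(log_harmonic_mean(fake_ll))
```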
Simple Model Wins (from Lewis, 2008)
Bayes Factor Comparisons
Interpretation of the Bayes factor
2ln(B10)    B10         Evidence against M0
0 to 2      1 to 3      Not worth more than a bare mention
2 to 6      3 to 20     Positive
6 to 10     20 to 150   Strong
> 10        > 150       Very strong
Bayesian Software
 Model testing
 ModelTest
 MrModelTest
 MrAIC
 Convergence diagnostics
 AWTY
 Tracer
 Phylogenetic inference
 MrBayes
 BEAST
 BayesPhylogenies
 PhyloBayes
 Phycas
 BAMBE
 RevBayes
 Specialized inference
 PHASE
 BAliPhy
 BayesTraits
 Badger
 BEST
 *BEAST
 CoEvol
 Tree drawing
 TreeView
 FigTree
Listening to lectures,
after a certain age,
diverts the mind too much
from its creative pursuits.
Any scientist who attends
too many lectures and
uses her own brain too
little falls into lazy habits
of thinking.
after Albert Einstein