Fishing for Errors
Extending the Treatment of Errors in the Fisher Matrix Formalism

Moumita Aich
Mohammed El-Mufti
Eli Kasai
Brian Nord
Marina Seikel
Sahba Yahya
Alan Heavens
Bruce Bassett

12 April 2012
Fisher Power!
The Fisher Matrix forecasts astronomical constraints on model parameters: it can be used to predict confidence contours for cosmological parameters.

[Figure: confidence contours in the (θ_A, θ_B) plane; 68% confidence that the parameters lie within the blue (dashed) contour, 99% confidence within the outer contour.]

Ingredients required:
    a parametrized, physical model for an observable [ y = f(x; θ) ]
    a set of errors in the observable [ σ_y ]

Example: Baryon Acoustic Oscillations [e.g., BigBOSS]
    Model and parameters: d_A(z; H0, Ωk) and H(z; H0, Ωk, Ωm)
    a set of errors in the observable variables: σ_dA, σ_H

General Formalism:

F_{AB} = \left\langle -\frac{\partial^2 \ln \mathcal{L}}{\partial\theta_A\,\partial\theta_B} \right\rangle    [definition]

F_{AB} = \frac{\partial f(x)}{\partial\theta_A}^{\!T} C^{-1}\, \frac{\partial f(x)}{\partial\theta_B} + \frac{1}{2}\,\mathrm{Tr}\!\left[ C^{-1}\frac{\partial C}{\partial\theta_A}\, C^{-1}\frac{\partial C}{\partial\theta_B} \right]

where y = f(x; θ) is the model and C = \mathrm{diag}(\sigma_{y_1}^2, \ldots, \sigma_{y_N}^2) is the covariance of the data-variable errors.
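As a sanity-check sketch (the toy linear model and numbers here are assumed for illustration, not from the talk), a parameter-independent diagonal covariance makes the trace term vanish, and the forecast reduces to the first term:

```python
import numpy as np

# Assumed toy setup: observable y = f(x; a, b) = a + b*x measured at N points
# with fixed Gaussian errors sigma_y (so the trace term vanishes).
x = np.linspace(0.0, 1.0, 20)            # independent variable (e.g., redshift)
sigma_y = 0.1 * np.ones_like(x)          # errors in the observable
C = np.diag(sigma_y**2)                  # diagonal data covariance

# Parameter derivatives of f: df/da = 1, df/db = x.
df = np.column_stack([np.ones_like(x), x])   # shape (N, 2)

Cinv = np.linalg.inv(C)
F = df.T @ Cinv @ df                     # 2x2 Fisher matrix

# Forecast marginalized 1-sigma parameter errors:
errors = np.sqrt(np.diag(np.linalg.inv(F)))
```

The inverse Fisher matrix gives the forecast parameter covariance, from which the confidence contours in the figure are drawn.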
                     This work focuses on the covariance matrix, C
Primary Goals and Questions

•   Will the errors in the independent variable [e.g., redshift] impact predicted constraints on model parameters?

•   What is the impact of the dependent and independent variables being correlated?

•   Can we account for multi-peaked distributions in the independent variable? e.g., double-peaked distributions for photometric redshifts. [Next time]

Each question corresponds to a covariance structure:

C = \mathrm{diag}(\sigma_{y_1}^2, \ldots, \sigma_{y_N}^2)    [errors in the dependent variable only]

C = \begin{pmatrix} \sigma_{xx}^2 & 0 \\ 0 & \sigma_{yy}^2 \end{pmatrix}    [independent x- and y-errors]

C = \begin{pmatrix} \sigma_{xx}^2 & \sigma_{xy}^2 \\ \sigma_{yx}^2 & \sigma_{yy}^2 \end{pmatrix}    [correlated x- and y-errors]
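The three covariance structures can be written down directly; a minimal NumPy sketch with assumed toy numbers (the sigma values and the correlation coefficient rho are illustrative):

```python
import numpy as np

# (i) errors in the dependent variable only
sigma_y1, sigma_y2 = 0.1, 0.2
C_y_only = np.diag([sigma_y1**2, sigma_y2**2])

# (ii) independent x- and y-errors for one (x, y) pair
sigma_x, sigma_y = 0.05, 0.1
C_indep = np.diag([sigma_x**2, sigma_y**2])

# (iii) correlated x- and y-errors (rho is an assumed correlation coefficient)
rho = 0.3
sigma_xy2 = rho * sigma_x * sigma_y
C_full = np.array([[sigma_x**2, sigma_xy2],
                   [sigma_xy2,  sigma_y**2]])
```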
Outline

•   The motivation: enhance FM predictions with more comprehensive error accounting.

•   General approach: derive the FM from scratch.

•   Introducing covariances in observables.
Re-Derive FM From First Principles

Setup [Observables]
    Measured observables:   {X_i}, {Y_i};  i = 1, …, N
    True values of observables:   {x_i}, {y_i};  i = 1, …, N

Setup [Model]
    Errors: X and Y are Gaussian-distributed about the true values, x and y, respectively.
    Mean model: x and y are related by  y = f(x)

Calculate Likelihood

\mathcal{L}(\theta) = p(X, Y \,|\, \theta) \propto p(\theta \,|\, X, Y)    (likelihood of the parameters via Bayes' theorem)

\mathcal{L} = \int p(X, Y \,|\, x, y, \theta)\, p(y \,|\, x, \theta)\, p(x \,|\, \theta)\, d^N x\, d^N y    (unpack all the conditional probabilities)

Let y always be a function of x:   p(y \,|\, x, \theta) = \delta(y - f(x))
Assume a uniform distribution for x:   x_i \sim U = p(x_i \,|\, \theta)

\Rightarrow \mathcal{L} = \int p(X, Y \,|\, x, f, \theta)\, d^N x

(the delta function performs the d^N y integral)
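The marginalization can be checked numerically. In this sketch (an assumed linear model and a single assumed (X, Y) pair), the delta function sets y = f(x) and the uniform prior drops out, leaving a one-dimensional integral over x evaluated by brute-force quadrature:

```python
import numpy as np

X_obs, Y_obs = 0.5, 2.1            # one measured (X, Y) pair (assumed)
sigma_x, sigma_y = 0.05, 0.1       # assumed Gaussian error widths

def f(x, theta):                   # assumed linear model for illustration
    return theta[0] + theta[1] * x

def gauss(u, mu, s):
    return np.exp(-0.5 * ((u - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def marginal_likelihood(theta, ngrid=2001):
    # L(theta) = integral of p(X_obs | x) p(Y_obs | f(x; theta)) dx
    x = np.linspace(X_obs - 6 * sigma_x, X_obs + 6 * sigma_x, ngrid)
    integrand = gauss(X_obs, x, sigma_x) * gauss(Y_obs, f(x, theta), sigma_y)
    return integrand.sum() * (x[1] - x[0])

# For a linear model f = a + b*x, this marginal likelihood is a Gaussian
# in the residual Y - f(X) with variance sigma_y**2 + b**2 * sigma_x**2.
L_val = marginal_likelihood((1.0, 2.0))
```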
Calculate Likelihood (II.)

\mathcal{L} = \int p(X, Y \,|\, x, f, \theta)\, d^N x

The distributions in X and Y are normal, but the integral is not generally analytically soluble. Therefore, we Taylor-expand the model, assuming it is linear across the width of the Gaussian distribution of x and retaining only linear terms:

f(x_i) \rightarrow f^*(x_i) = f(X_i) + (x_i - X_i)\, f'(X_i)

Let Z be a 2N-dimensional vector containing both measured and true observables:

\{z_i, Z_i\} = \{x_i, X_i\}   for i \le N
\{z_i, Z_i\} = \{f^*, Y_i\}   for i > N

This provides the canonical form of the multivariate normal distribution:

\Rightarrow \mathcal{L} \propto \int \frac{1}{\sqrt{\det C}} \exp\!\left[ -\frac{1}{2} (Z - z)^T C^{-1} (Z - z) \right] d^N x

With {z, Z} as vectors, the covariance matrix can be written in block form:

C = \begin{pmatrix} C_{XX} & C_{XY} \\ C_{XY}^T & C_{YY} \end{pmatrix}

Notice that this form natively contains covariance among the X's, among the Y's, and between the X's and Y's.
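The 2N-dimensional bookkeeping can be sketched as follows (toy linear model and noise levels assumed; here f* is evaluated at x = X, where it equals f(X)):

```python
import numpy as np

N = 5
rng = np.random.default_rng(0)

sigma_x, sigma_y = 0.05, 0.1       # assumed error levels

x_true = np.sort(rng.uniform(0.0, 1.0, N))                  # true x values
X = x_true + sigma_x * rng.standard_normal(N)               # measured x
Y = 1.0 + 2.0 * x_true + sigma_y * rng.standard_normal(N)   # measured y

def f(x):                          # assumed linear model for illustration
    return 1.0 + 2.0 * x

# Linearized model f*(x) = f(X) + (x - X) f'(X); at x = X it equals f(X).
Z_big = np.concatenate([X, Y])     # measured part: (X_1..X_N, Y_1..Y_N)
z_big = np.concatenate([X, f(X)])  # true part under the linearized model

# Block covariance; here the x- and y-errors are taken as uncorrelated,
# so the off-diagonal block C_XY is zero.
C_XX = sigma_x**2 * np.eye(N)
C_YY = sigma_y**2 * np.eye(N)
C_XY = np.zeros((N, N))
C = np.block([[C_XX, C_XY], [C_XY.T, C_YY]])   # shape (2N, 2N)
```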
Calculate Likelihood (III.)

\mathcal{L} \propto \int \frac{1}{\sqrt{\det C}} \exp\!\left[ -\frac{1}{2} (Z - z)^T C^{-1} (Z - z) \right] d^N x

Evaluating the exponent and simplifying,

\Rightarrow \mathcal{L} \propto \frac{1}{\sqrt{\det R}} \exp\!\left[ -\frac{1}{2} \tilde{Y}^T R^{-1} \tilde{Y} \right]

where R is a function of C_{ij} and f^*:

R = C_{YY} + C_{XY}^T\, T + T\, C_{XY} + T\, C_{XX}\, T

T = \mathrm{diag}\!\left( \left. \frac{df(x)}{dx} \right|_{x=X} \right)

Result:   Where the x data are irrelevant,
          i.e., when the derivatives of f are zero,
          i.e., when C_XX (or σ_x) = 0,
          we recover the original form: R → C = C_YY.
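The result can be verified numerically. In this sketch (an assumed linear model f(x) = a + b·x with uncorrelated errors, so C_XY = 0), R predicts Var(Y − f(X)) = σ_y² + b²σ_x², and setting T = 0 recovers C_YY:

```python
import numpy as np

def R_matrix(C_YY, C_XY, C_XX, T):
    # R = C_YY + C_XY^T T + T C_XY + T C_XX T
    return C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

a, b = 1.0, 2.0                      # assumed linear model f(x) = a + b*x
sigma_x, sigma_y = 0.05, 0.1
C_XX = sigma_x**2 * np.eye(1)
C_YY = sigma_y**2 * np.eye(1)
C_XY = np.zeros((1, 1))              # uncorrelated x- and y-errors
T = b * np.eye(1)                    # diag(df/dx) for the linear model

R = R_matrix(C_YY, C_XY, C_XX, T)    # predicts sigma_y^2 + b^2 sigma_x^2

# T -> 0 (derivatives of f vanish) recovers R -> C_YY:
assert np.allclose(R_matrix(C_YY, C_XY, C_XX, 0.0 * T), C_YY)

# Monte Carlo check: scatter of Y - f(X) with both error sources present.
rng = np.random.default_rng(1)
x_true = rng.uniform(0.0, 1.0, 200_000)
X = x_true + sigma_x * rng.standard_normal(x_true.size)
Y = a + b * x_true + sigma_y * rng.standard_normal(x_true.size)
mc_var = np.var(Y - (a + b * X))     # should approach R[0, 0]
```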
Main Result:

With the help of Tegmark, Taylor and Heavens (1997), R then takes the place of the covariance, C, in the canonical formulation:

F_{AB} = \frac{\partial f}{\partial\theta_A}^{\!T} R^{-1}\, \frac{\partial f}{\partial\theta_B} + \frac{1}{2}\,\mathrm{Tr}\!\left[ R^{-1}\frac{\partial R}{\partial\theta_A}\, R^{-1}\frac{\partial R}{\partial\theta_B} \right]

Even if C does not depend on the parameters, R does depend on them [via f]. The trace term is in general non-zero.
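A sketch of the extended Fisher matrix for an assumed toy power-law model (f(x; θ) = θ0·x^θ1, all numbers illustrative), with parameter derivatives of both f and R taken by central finite differences; the trace term contributes even though the C blocks themselves are parameter-independent:

```python
import numpy as np

x = np.linspace(0.1, 1.0, 10)        # assumed data grid (x > 0)
sigma_x, sigma_y = 0.05, 0.1
C_XX = sigma_x**2 * np.eye(x.size)
C_YY = sigma_y**2 * np.eye(x.size)
C_XY = np.zeros((x.size, x.size))

def f(x, theta):                     # assumed toy model: f = theta0 * x**theta1
    return theta[0] * x ** theta[1]

def R_of(theta):
    # T = diag(df/dx) at the measured x; R as in the extended formalism.
    T = np.diag(theta[0] * theta[1] * x ** (theta[1] - 1.0))
    return C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

def fisher(theta, eps=1e-6):
    npar = len(theta)
    Rinv = np.linalg.inv(R_of(theta))
    df, dR = [], []
    for A in range(npar):            # central finite differences in theta_A
        tp = np.array(theta, dtype=float); tp[A] += eps
        tm = np.array(theta, dtype=float); tm[A] -= eps
        df.append((f(x, tp) - f(x, tm)) / (2.0 * eps))
        dR.append((R_of(tp) - R_of(tm)) / (2.0 * eps))
    F = np.zeros((npar, npar))
    for A in range(npar):
        for B in range(npar):
            F[A, B] = (df[A] @ Rinv @ df[B]
                       + 0.5 * np.trace(Rinv @ dR[A] @ Rinv @ dR[B]))
    return F

F = fisher([1.0, 2.0])
```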
Summary: An Extended Formalism

•   Development of a general process for evaluating arbitrary model functions to first order in the FM formalism.

•   Incorporation of correlated errors among observables.

Next Steps

•   Application: double-peaked error distributions

•   Check: compare to MCMC

•   Cosmological application

•   Incorporate into Fisher4Cast
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamUiPathCommunity
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Jeffrey Haguewood
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoffsammart93
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...Zilliz
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Victor Rentea
 

Dernier (20)

Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...
Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...
Apidays New York 2024 - APIs in 2030: The Risk of Technological Sleepwalk by ...
 
Ransomware_Q4_2023. The report. [EN].pdf
Ransomware_Q4_2023. The report. [EN].pdfRansomware_Q4_2023. The report. [EN].pdf
Ransomware_Q4_2023. The report. [EN].pdf
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf
 
Spring Boot vs Quarkus the ultimate battle - DevoxxUK
Spring Boot vs Quarkus the ultimate battle - DevoxxUKSpring Boot vs Quarkus the ultimate battle - DevoxxUK
Spring Boot vs Quarkus the ultimate battle - DevoxxUK
 
Why Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire businessWhy Teams call analytics are critical to your entire business
Why Teams call analytics are critical to your entire business
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
 
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
Navigating the Deluge_ Dubai Floods and the Resilience of Dubai International...
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data DiscoveryTrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
TrustArc Webinar - Unlock the Power of AI-Driven Data Discovery
 
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
Apidays New York 2024 - Accelerating FinTech Innovation by Vasa Krishnan, Fin...
 
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 AmsterdamDEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
DEV meet-up UiPath Document Understanding May 7 2024 Amsterdam
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
Web Form Automation for Bonterra Impact Management (fka Social Solutions Apri...
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century education
 
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ..."I see eyes in my soup": How Delivery Hero implemented the safety system for ...
"I see eyes in my soup": How Delivery Hero implemented the safety system for ...
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024Finding Java's Hidden Performance Traps @ DevoxxUK 2024
Finding Java's Hidden Performance Traps @ DevoxxUK 2024
 

Fishermatrix extended ctics_reloaded

  • 1. Fishing for Errors: Extending the Treatment of Errors in the Fisher Matrix Formalism. Moumita Aich, Mohammed El-Mufti, Eli Kasai, Brian Nord, Marina Seikel, Sahba Yahya, Alan Heavens, Bruce Bassett. 12 April 2012.
  • 2. Fisher Power! The Fisher Matrix forecasts astronomical constraints for model parameters: it can be used to predict confidence contours for cosmological parameters. [Figure: 68% confidence contour (blue, dashed) and 99% confidence contour in the θA-θB plane.] Ingredients required: a parametrized, physical model for an observable [ y = f(x;θ) ], and a set of errors in the observable [ σy ]. Example: Baryon Acoustic Oscillations [e.g., BigBOSS]. Model and parameters: dA(z; H0, Ωk) and H(z; H0, Ωk, Ωm); errors in the observed variables: σdA, σH.
  • 3–4. General Formalism: [definition] F_{AB} = -\left\langle \frac{\partial^2 \ln \mathcal{L}}{\partial\theta_A\,\partial\theta_B} \right\rangle, with the model y = f(x;\theta) and the covariance of the data-variable errors C = \mathrm{diag}(\sigma_{y_1}^2, \ldots, \sigma_{y_N}^2). For Gaussian-distributed data this gives F_{AB} = \left(\frac{\partial f}{\partial\theta_A}\right)^{T} C^{-1} \left(\frac{\partial f}{\partial\theta_B}\right) + \frac{1}{2}\,\mathrm{Tr}\left[ C^{-1}\frac{\partial C}{\partial\theta_A}\, C^{-1}\frac{\partial C}{\partial\theta_B} \right]. This work focuses on the covariance matrix, C.
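The canonical formalism above can be sketched numerically. A minimal example, assuming a toy linear model y = θ0 + θ1·x with a fixed diagonal covariance (so the trace term vanishes); all names and values here are illustrative, not from the talk:

```python
import numpy as np

# Toy model y = f(x; theta) = theta0 + theta1 * x, a hypothetical stand-in
# for d_A(z) or H(z).  C is parameter-independent here, so only the
# derivative term of the Fisher matrix survives.
x = np.linspace(0.1, 1.0, 10)          # independent variable (e.g. redshift)
sigma_y = 0.05 * np.ones_like(x)       # errors on the observable
C = np.diag(sigma_y**2)                # covariance of the data-variable errors

# Derivatives of the model with respect to each parameter theta_A
df = np.column_stack([np.ones_like(x),  # d f / d theta0
                      x])               # d f / d theta1

Cinv = np.linalg.inv(C)
F = df.T @ Cinv @ df                    # F_AB = (df/dA)^T C^-1 (df/dB)

# Forecast 1-sigma marginalized errors: sqrt of the diagonal of F^-1
param_errors = np.sqrt(np.diag(np.linalg.inv(F)))
print(param_errors)
```

For a BAO-style forecast one would swap the toy model for dA(z; H0, Ωk) or H(z; H0, Ωk, Ωm) and σy for the survey's forecast errors.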
  • 5–6. Primary Goals and Questions: (1) Will the errors in the independent variable [e.g., redshift] impact predicted constraints on model parameters? (2) What is the impact of the dependent and independent variables being correlated? (3) Can we account for multi-peaked distributions in the independent variable (e.g., double-peaked distributions for photometric redshifts)? [Next time] The corresponding covariance structures: C = \begin{pmatrix} \sigma_{y_1}^2 & & 0 \\ & \ddots & \\ 0 & & \sigma_{y_N}^2 \end{pmatrix}; \quad \begin{pmatrix} \sigma_{xx}^2 & 0 \\ 0 & \sigma_{yy}^2 \end{pmatrix}; \quad \begin{pmatrix} \sigma_{xx}^2 & \sigma_{xy}^2 \\ \sigma_{yx}^2 & \sigma_{yy}^2 \end{pmatrix}.
  • 7. Outline: The motivation: enhance FM predictions with more comprehensive error accounting. The general approach: derive the FM from scratch. Introducing covariances in observables.
  • 8–11. Re-Derive the FM From First Principles. Setup [Observables]: measured observables \{X_i\}, \{Y_i\} and true values \{x_i\}, \{y_i\}, for i = 1, \ldots, N. Setup [Model]: errors: X and Y are Gaussian-distributed about the true values x and y, respectively; mean model: x and y are related by y = f(x). Calculate the likelihood of a parameter via Bayes' theorem: \mathcal{L}(\theta) = p(X, Y \mid \theta) \propto p(\theta \mid X, Y). Unpacking all the conditional probabilities, \mathcal{L} = \int p(X, Y \mid x, y, \theta)\, p(y \mid x, \theta)\, p(x \mid \theta)\, d^N x\, d^N y. Let y always be a function of x, p(y \mid x, \theta) = \delta(y - f(x)), and assume a uniform distribution for x, x_i \sim U = p(x_i \mid \theta). Integrating out y against the delta function then gives \mathcal{L} = \int p(X, Y \mid x, f(x), \theta)\, d^N x.
  • 12–14. Calculate Likelihood (II.) The distributions in X and Y are normal, but the integral \mathcal{L} = \int p(X, Y \mid x, f, \theta)\, d^N x is not generally analytically soluble. Therefore, we Taylor-expand the model, assuming it is linear across the width of the Gaussian distribution of x and retaining only the linear term: f(x_i) \rightarrow f^*(x_i) = f(X_i) + (x_i - X_i) f'(X_i). Let Z be a 2N-dimensional vector containing both measured and true observables: \{z_i, Z_i\} = \{x_i, X_i\} for i \le N and \{z_i, Z_i\} = \{f^*(x_i), Y_i\} for i > N. This provides the canonical form of the multivariate normal distribution: \mathcal{L} \propto \frac{1}{\sqrt{\det C}} \exp\left[ -\frac{1}{2} (Z - z)^T C^{-1} (Z - z) \right]. With \{z, Z\} as vectors, the covariance matrix can be written in block form: C = \begin{pmatrix} C_{XX} & C_{XY} \\ C_{XY}^T & C_{YY} \end{pmatrix}. Notice that this form natively contains covariance among the X's, among the Y's, and between the X's and Y's.
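The accuracy of the linearization step can be checked directly: for Gaussian scatter in x that is narrow relative to the curvature of f, the first-order expansion f* tracks the exact model to order σx². A small sketch, with f(x) = x² chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

f = lambda x: x**2            # illustrative model, not from the talk
fprime = lambda x: 2 * x

X = 1.0                       # measured value of the independent variable
sigma_x = 0.01                # narrow Gaussian scatter about X
x = rng.normal(X, sigma_x, size=100_000)   # samples of the true value

# First-order Taylor expansion of the model about the measured value
f_star = f(X) + (x - X) * fprime(X)

# For f(x) = x^2 the truncation error is exactly (x - X)^2 ~ sigma_x^2
max_err = np.max(np.abs(f(x) - f_star))
print(max_err)
```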
  • 15–16. Calculate Likelihood (III.) Starting from \mathcal{L} \propto \frac{1}{\sqrt{\det C}} \exp\left[ -\frac{1}{2} (Z - z)^T C^{-1} (Z - z) \right], evaluating the exponent and simplifying gives \mathcal{L} \propto \frac{1}{\sqrt{\det R}} \exp\left[ -\frac{1}{2} \tilde{Y}^T R^{-1} \tilde{Y} \right], where R is a function of the C_{ij} and f^*: R = C_{YY} + C_{XY}^T T + T\, C_{XY} + T\, C_{XX}\, T, with T = \mathrm{diag}\left( \left. \frac{df(x)}{dx} \right|_{x=X} \right). Result: where the x data are irrelevant, i.e., when the derivatives of f are zero or when C_{XX} (equivalently, \sigma_x) vanishes, we recover the original form: R \rightarrow C = C_{YY}.
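The composite covariance R can be checked against a direct simulation: the residuals Y − f(X) pick up variance both from σy and, through the slope, from σx. A sketch under illustrative assumptions (f = sin, uncorrelated x and y errors, hypothetical data points):

```python
import numpy as np

rng = np.random.default_rng(1)

f = lambda x: np.sin(x)          # illustrative model
fprime = lambda x: np.cos(x)

x0 = np.array([0.3, 0.7, 1.2])   # true values of the independent variable
N = x0.size
sigma_x, sigma_y = 0.02, 0.05

C_XX = sigma_x**2 * np.eye(N)
C_YY = sigma_y**2 * np.eye(N)
C_XY = np.zeros((N, N))          # x and y errors uncorrelated here

# R = C_YY + C_XY^T T + T C_XY + T C_XX T, with T = diag(df/dx)
T = np.diag(fprime(x0))
R = C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

# Monte Carlo: scatter the measured X about the truth, form Y - f(X),
# and compare its covariance with R
n = 200_000
Xs = x0 + sigma_x * rng.standard_normal((n, N))
Ys = f(x0) + sigma_y * rng.standard_normal((n, N))
R_mc = np.cov(Ys - f(Xs), rowvar=False)
print(np.max(np.abs(R_mc - R)))

# Limit check: with C_XX = 0 the extra terms vanish and R -> C_YY
R0 = C_YY + C_XY.T @ T + T @ C_XY + T @ (0 * C_XX) @ T
```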
  • 18–19. Main Result: Following Tegmark, Taylor and Heavens (1997), R takes the place of the covariance C in the canonical formulation: F_{AB} = \left(\frac{\partial f}{\partial\theta_A}\right)^{T} R^{-1} \left(\frac{\partial f}{\partial\theta_B}\right) + \frac{1}{2}\,\mathrm{Tr}\left[ R^{-1}\frac{\partial R}{\partial\theta_A}\, R^{-1}\frac{\partial R}{\partial\theta_B} \right]. Even if C does not depend on the parameters, R does depend on them [via f], so the trace term is in general non-zero.
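The main result can be sketched in code: R replaces C, and because R depends on the parameters through f, the trace term contributes even though C_YY and C_XX are fixed. A minimal illustration, assuming a toy model y = θ0 + θ1·x with hypothetical error sizes, using central finite differences for the parameter derivatives:

```python
import numpy as np

# Sketch of the extended Fisher matrix with R in place of C; the model,
# parameter values, and error sizes are illustrative assumptions.
x = np.linspace(0.1, 1.0, 8)
N = x.size
sigma_x, sigma_y = 0.02, 0.05
C_XX = sigma_x**2 * np.eye(N)
C_YY = sigma_y**2 * np.eye(N)
C_XY = np.zeros((N, N))

def f(x, theta):             # toy model y = theta0 + theta1 * x
    return theta[0] + theta[1] * x

def R_of(theta):             # R = C_YY + C_XY^T T + T C_XY + T C_XX T
    T = np.diag(np.full(N, theta[1]))   # df/dx = theta1 for this model
    return C_YY + C_XY.T @ T + T @ C_XY + T @ C_XX @ T

def fisher(theta, eps=1e-6):
    Rinv = np.linalg.inv(R_of(theta))
    F = np.zeros((2, 2))
    for A in range(2):
        for B in range(2):
            dA = np.zeros(2); dA[A] = eps
            dB = np.zeros(2); dB[B] = eps
            # Central differences for df/dtheta and dR/dtheta
            df_dA = (f(x, theta + dA) - f(x, theta - dA)) / (2 * eps)
            df_dB = (f(x, theta + dB) - f(x, theta - dB)) / (2 * eps)
            dR_dA = (R_of(theta + dA) - R_of(theta - dA)) / (2 * eps)
            dR_dB = (R_of(theta + dB) - R_of(theta - dB)) / (2 * eps)
            F[A, B] = (df_dA @ Rinv @ df_dB
                       + 0.5 * np.trace(Rinv @ dR_dA @ Rinv @ dR_dB))
    return F

theta = np.array([0.0, 1.0])
F = fisher(theta)
print(F)
```

For this model R = C_YY + θ1² C_XX, so ∂R/∂θ1 = 2 θ1 σx² I; the trace term adds a positive contribution to F_11, while σx → 0 recovers the canonical Fisher matrix.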
  • 20. Summary: An Extended Formalism. Development of a general process for evaluating arbitrary model functions to first order in the FM formalism; incorporation of correlated errors among observables. Next steps: application to double-peaked error distributions; a check against MCMC; a cosmological application; incorporation into Fisher4Cast.

Editor's notes

  2–3. Define the Fisher matrix and its pieces. Start the motivation: (1) why it's used; (2) why we want to modify it, noting where the errors enter. Given a model, it propagates errors from the data onto the model parameter estimates.
  4. Mention Trotta and previous works, and propagation of error.
  6–8. Step through this derivation, noting the key features.
  9–10. Key features: (1) the small variation over the interval allows for the Taylor expansion.
  11. Start generally and choose the cases that are of interest. What are the key elements in the derivation that we should mention? What are the applications for this? [This is a big question for us, since it still needs to be addressed and worked on for the paper.]
  15. Show some basic results from the propagation-of-error method: show the resulting equation for the FM and its behavior with varying sigma_x or sigma_y (plots!). This is the naive version starting with the FM from slide 2, where people always start. Examples from the analytics for the linear case.
  16. The naive version also starting with the canonical form of the FM. Examples with the linear case.
  17. Did we ever nail down why this difference occurred? Does it simply come from the fact that the MOE doesn't have the second [covariance] term in the canonical FM equation?
  18. The motivation for going deeper was simply that the methods disagreed?