Lesson 31
  First Order, Higher Dimensional Difference
                   Equations

                          Math 20


                       April 30, 2007


Announcements
   PS 12 due Wednesday, May 2
   MT III Friday, May 4 in SC Hall A
   Final Exam: Friday, May 25 at 9:15am, Boylston 110 (Fong
   Auditorium)
Recap

Higher dimensional linear systems
   Examples
        Markov Chains
        Population Dynamics
   Solution

Qualitative Analysis
  Diagonal systems
  Examples

Higher dimensional nonlinear
one-dimensional linear difference equations
   Fact
   The inhomogeneous difference equation

       y_{k+1} = a y_k + b

   (with a ≠ 1) has solution

       y_k = a^k (y_0 − b/(1−a)) + b/(1−a)

   Please try not to memorize this. When a and b have actual values, it's easier to follow this process:
    1. Start with a^k times an undetermined parameter c (this satisfies the homogenized equation).
    2. Find the equilibrium value y∗.
    3. Add the two and pick c to match y_0 when k = 0.
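
   As a quick sanity check (added here, not part of the original slides), the following Python sketch iterates y_{k+1} = a y_k + b directly and compares against the closed form; the values a = 0.5, b = 2, y_0 = 10 are arbitrary illustrations.

       # Check y_k = a^k (y_0 - b/(1-a)) + b/(1-a) against direct iteration (a != 1)
       a, b, y0 = 0.5, 2.0, 10.0          # arbitrary example values
       y_star = b / (1 - a)               # step 2: the equilibrium value y*

       y = y0
       for k in range(11):
           closed_form = a**k * (y0 - y_star) + y_star
           assert abs(y - closed_form) < 1e-9
           y = a * y + b                  # iterate y_{k+1} = a y_k + b
       print("closed form matches iteration; equilibrium y* =", y_star)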
Nonlinear equations

 Fact
 The equilibrium point y∗ of the nonlinear difference equation y_{k+1} = g(y_k) is stable if |g′(y∗)| < 1.

 [Figure: cobweb diagram showing the iterates y_0, y_1, y_2 converging to y∗, with the graph of g (slope g′(y∗) near the equilibrium) drawn between reference lines of slope 1 and −1.]
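
 As a concrete (invented) illustration of this criterion, take the logistic map g(y) = 2.5 y (1 − y): its nonzero equilibrium is y∗ = 0.6 with g′(y∗) = −0.5, so |g′(y∗)| < 1 and iterates starting nearby converge to y∗. A minimal Python sketch:

     # Hypothetical example: logistic map g(y) = r*y*(1-y), r = 2.5
     r = 2.5
     g = lambda y: r * y * (1 - y)
     y_star = 1 - 1 / r                 # equilibrium: g(y*) = y*, here 0.6
     g_prime = r - 2 * r * y_star       # g'(y*) = -0.5, so |g'(y*)| < 1

     y = 0.1                            # start away from the equilibrium
     for _ in range(50):
         y = g(y)
     print(round(y, 6), "approaches y* =", y_star)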
Recap

Higher dimensional linear systems
   Examples
        Markov Chains
        Population Dynamics
   Solution

Qualitative Analysis
  Diagonal systems
  Examples

Higher dimensional nonlinear
Let's kick it up a notch and look at the multivariable, linear, homogeneous difference equation

    y(k + 1) = A y(k)

(We move the index into parentheses to allow y(k) to have coordinates and to avoid writing y_{k,i}.)
Skipping class



   Example
   This example was a Markov chain with transition matrix

       A = [ 0.7  0.8 ]
           [ 0.3  0.2 ]

   Then the probability of going or skipping on day k satisfies the equation

       p(k + 1) = A p(k)
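
   A short numpy sketch (an added illustration; the initial distribution p(0) = (1, 0) is an assumption) iterating this Markov chain:

       import numpy as np

       A = np.array([[0.7, 0.8],
                     [0.3, 0.2]])       # transition matrix from the slide
       p = np.array([1.0, 0.0])         # assumed: definitely going on day 0
       for _ in range(20):
           p = A @ p                    # p(k+1) = A p(k)
       print(p)                         # settles near (8/11, 3/11) ~ (0.727, 0.273)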
Example
Female lobsters have more eggs each season the longer they live. For this reason, it is illegal to keep a lobster that has laid eggs.
Let y_i be the number of lobsters in a fishery that are i years old. Then the difference equation might have the simplified form

    y(k + 1) = [ 0    100  400  700 ]
               [ 0.1  0    0    0   ] y(k)
               [ 0    0.3  0    0   ]
               [ 0    0    0.9  0   ]
Mmmm. . . Lobster
Formal solution

       y(1) = A y(0)
       y(2) = A y(1) = A^2 y(0)
       y(3) = A y(2) = A^3 y(0)

   So

   Fact
   The solution to the homogeneous system of linear difference equations y(k + 1) = A y(k) is

       y(k) = A^k y(0)
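
   A quick numerical check (added, not from the slides) that repeated iteration and the closed form A^k y(0) agree, using the Markov matrix above and an assumed initial vector:

       import numpy as np

       A = np.array([[0.7, 0.8],
                     [0.3, 0.2]])
       y0 = np.array([1.0, 0.0])                      # assumed initial vector

       k = 10
       y_iter = y0.copy()
       for _ in range(k):
           y_iter = A @ y_iter                        # iterate y(k+1) = A y(k)

       y_closed = np.linalg.matrix_power(A, k) @ y0   # y(k) = A^k y(0)
       print(np.allclose(y_iter, y_closed))           # True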
Flop count

      To multiply two n × n matrices takes n^3 (n − 1) additions or multiplications (flop = floating point operation).
      So finding A^k takes about n^4 k flops!
Now what?
  Suppose v is an eigenvector of A with eigenvalue λ. Then the solution to the problem

      y(k + 1) = A y(k),    y(0) = v

  is

      y(k) = λ^k v

  Suppose

      y(0) = c_1 v_1 + c_2 v_2 + · · · + c_m v_m

  Then

      A y(0) = c_1 λ_1 v_1 + c_2 λ_2 v_2 + · · · + c_m λ_m v_m
      A^2 y(0) = c_1 λ_1^2 v_1 + c_2 λ_2^2 v_2 + · · · + c_m λ_m^2 v_m

  If A is diagonalizable, we can take m = n and write any initial vector as a linear combination of eigenvectors.
The big picture

   Fact
   Let A have a complete system of eigenvalues and eigenvectors λ_1, λ_2, . . . , λ_n and v_1, v_2, . . . , v_n. Then the solution to the difference equation y(k + 1) = A y(k) is

       y(k) = A^k y(0) = c_1 λ_1^k v_1 + c_2 λ_2^k v_2 + · · · + c_n λ_n^k v_n

   where c_1, c_2, . . . , c_n are chosen to make

       y(0) = c_1 v_1 + c_2 v_2 + · · · + c_n v_n
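
   A sketch (added for illustration, using the Markov matrix from earlier and an assumed initial vector) of how the coefficients c_i and the eigenvalue form of the solution can be computed with numpy; the columns of V returned by numpy.linalg.eig are the eigenvectors v_i, and solving V c = y(0) gives the c_i:

       import numpy as np

       A = np.array([[0.7, 0.8],
                     [0.3, 0.2]])
       y0 = np.array([1.0, 0.0])              # assumed initial vector

       lam, V = np.linalg.eig(A)              # eigenvalues lam_i, eigenvectors as columns of V
       c = np.linalg.solve(V, y0)             # y(0) = c_1 v_1 + ... + c_n v_n

       k = 10
       y_k = V @ (c * lam**k)                 # y(k) = sum_i c_i lam_i^k v_i
       print(np.allclose(y_k, np.linalg.matrix_power(A, k) @ y0))   # True

   After the single eigendecomposition, each y(k) costs only scalar powers and one small matrix-vector product.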
Recap

Higher dimensional linear systems
   Examples
        Markov Chains
        Population Dynamics
   Solution

Qualitative Analysis
  Diagonal systems
  Examples

Higher dimensional nonlinear
Iterating diagonal systems




   Consider a 2 × 2 matrix of the form

       D = [ λ_1  0   ]
           [ 0    λ_2 ]

   Then the λ's tell the behavior of the system.
Picture in terms of eigenvalues

       λ_1 > λ_2 > 1: repulsion away from the origin
       1 > λ_1 > λ_2 > 0: attraction to the origin
       λ_1 > 1 > λ_2: saddle point

   For negative eigenvalues, just square them and use the above results.
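
   A small illustration (added; the particular eigenvalues are arbitrary choices) of the three regimes, iterating y(k + 1) = D y(k) for diagonal D:

       import numpy as np

       cases = {
           "repulsion  (l1 > l2 > 1)":     np.diag([1.5, 1.2]),
           "attraction (1 > l1 > l2 > 0)": np.diag([0.8, 0.5]),
           "saddle     (l1 > 1 > l2)":     np.diag([1.5, 0.5]),
       }
       y0 = np.array([1.0, 1.0])
       for name, D in cases.items():
           y10 = np.linalg.matrix_power(D, 10) @ y0   # y(10) = D^10 y(0)
           print(name, "->", np.round(y10, 3))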
Back to skipping class

   Example
   If

       A = [ 0.7  0.8 ]
           [ 0.3  0.2 ]

   the eigenvectors (in decreasing order of absolute value of the eigenvalues) are (8/11, 3/11)^T with eigenvalue 1 and (−1/2, 1/2)^T with eigenvalue −1/10. So the system converges to a multiple of (8/11, 3/11)^T.
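
   These numbers can be confirmed with numpy (an added check, not part of the lecture):

       import numpy as np

       A = np.array([[0.7, 0.8],
                     [0.3, 0.2]])
       lam, V = np.linalg.eig(A)
       print(lam)                              # approximately [ 1.0, -0.1 ]

       v1 = V[:, np.argmax(np.abs(lam))]       # eigenvector for the eigenvalue 1
       print(v1 / v1.sum())                    # rescaled: (8/11, 3/11) ~ (0.727, 0.273)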
Back to the lobsters

   We had

       A = [ 0    100  400  700 ]
           [ 0.1  0    0    0   ]
           [ 0    0.3  0    0   ]
           [ 0    0    0.9  0   ]

   The eigenvalues are 3.80293, −2.84895, −0.476993 + 1.23164i, and −0.476993 − 1.23164i, and the first eigenvector is

       ( 0.999716  0.0233099  0.00489153 )^T

   The population will grow despite the increased harvesting!
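
   The same computation for the lobster matrix (an added illustration) reproduces the eigenvalues quoted above; the dominant eigenvalue is about 3.8 > 1, which is exactly why the population grows:

       import numpy as np

       A = np.array([[0.0, 100.0, 400.0, 700.0],
                     [0.1,   0.0,   0.0,   0.0],
                     [0.0,   0.3,   0.0,   0.0],
                     [0.0,   0.0,   0.9,   0.0]])
       lam, V = np.linalg.eig(A)
       order = np.argsort(-np.abs(lam))        # sort by decreasing |lambda|
       print(lam[order])                       # ~ 3.803, -2.849, -0.477 +/- 1.232i

       v1 = np.real(V[:, order[0]])            # dominant eigenvector: the long-run age structure
       print(v1 / np.linalg.norm(v1))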
Recap

Higher dimensional linear systems
   Examples
        Markov Chains
        Population Dynamics
   Solution

Qualitative Analysis
  Diagonal systems
  Examples

Higher dimensional nonlinear
The nonlinear case


  Consider now the nonlinear system

      y(k + 1) = g(y(k)).

  The process is as it was in the one-dimensional nonlinear case:
   1. Look for equilibria y∗ with g(y∗) = y∗.
   2. Linearize about the equilibrium using the Jacobian matrix

          A = Dg(y∗) = [ ∂g_i/∂y_j ]

   3. The eigenvalues of A determine the stability of y∗: as in one dimension, y∗ is stable when every eigenvalue of A satisfies |λ| < 1.
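
  As a hedged illustration of this recipe (the map below is an invented example, not from the lecture), take g(y) = (0.5 y_1 + 0.1 y_2^2, 0.2 y_2 + 0.3 y_1 y_2). The origin is an equilibrium, the Jacobian there is diag(0.5, 0.2), and both eigenvalues have modulus less than 1, so the origin is stable:

      import numpy as np

      def g(y):
          # invented 2-D nonlinear map with an equilibrium at the origin
          return np.array([0.5 * y[0] + 0.1 * y[1]**2,
                           0.2 * y[1] + 0.3 * y[0] * y[1]])

      y_star = np.array([0.0, 0.0])           # g(y*) = y*

      # Jacobian Dg(y*) = [dg_i/dy_j], evaluated by hand at the origin
      A = np.array([[0.5, 0.0],
                    [0.0, 0.2]])
      print(np.abs(np.linalg.eigvals(A)))     # all < 1, so y* is stable

      y = np.array([0.4, 0.3])                # iterate from a nearby point
      for _ in range(30):
          y = g(y)
      print(y)                                # converges toward the origin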
