Lecture 3:
       Program Analysis
         Steven Skiena

Department of Computer Science
 State University of New York
 Stony Brook, NY 11794–4400

http://www.cs.sunysb.edu/~skiena
Problem of the Day
Find two functions f (n) and g(n) that satisfy the following
relationship. If no such f and g exist, write "None".
1. f(n) = o(g(n)) and f(n) = Θ(g(n))
2. f(n) = Θ(g(n)) and f(n) = o(g(n))
3. f(n) = Θ(g(n)) and f(n) = O(g(n))
4. f(n) = Ω(g(n)) and f(n) = O(g(n))
Asymptotic Dominance in Action

   n               lg n       n          n lg n      n²           2ⁿ             n!
   10              0.003 µs   0.01 µs    0.033 µs    0.1 µs       1 µs           3.63 ms
   20              0.004 µs   0.02 µs    0.086 µs    0.4 µs       1 ms           77.1 years
   30              0.005 µs   0.03 µs    0.147 µs    0.9 µs       1 sec          8.4 × 10¹⁵ yrs
   40              0.005 µs   0.04 µs    0.213 µs    1.6 µs       18.3 min
   50              0.006 µs   0.05 µs    0.282 µs    2.5 µs       13 days
   100             0.007 µs   0.1 µs     0.644 µs    10 µs        4 × 10¹³ yrs
   1,000           0.010 µs   1.00 µs    9.966 µs    1 ms
   10,000          0.013 µs   10 µs      130 µs      100 ms
   100,000         0.017 µs   0.10 ms    1.67 ms     10 sec
   1,000,000       0.020 µs   1 ms       19.93 ms    16.7 min
   10,000,000      0.023 µs   0.01 sec   0.23 sec    1.16 days
   100,000,000     0.027 µs   0.10 sec   2.66 sec    115.7 days
   1,000,000,000   0.030 µs   1 sec      29.90 sec   31.7 years
Implications of Dominance

 • Exponential algorithms become hopeless very quickly.
 • Quadratic algorithms become hopeless at or before n = 1,000,000.
 • O(n log n) remains practical up to about one billion.
 • O(log n) never breaks a sweat.
Testing Dominance
f(n) dominates g(n) if lim_{n→∞} g(n)/f(n) = 0, which is the
same as saying g(n) = o(f(n)).
Note the little-oh: it means "grows strictly slower than".
Implications of Dominance

 • n^a dominates n^b if a > b, since
                     lim_{n→∞} n^b / n^a = n^{b−a} → 0

 • n^a + o(n^a) doesn't dominate n^a, since
                     lim_{n→∞} n^a / (n^a + o(n^a)) → 1
Dominance Rankings
You must come to accept the dominance ranking of the basic
functions:
      n! ≫ 2ⁿ ≫ n³ ≫ n² ≫ n log n ≫ n ≫ log n ≫ 1
Advanced Dominance Rankings
Additional functions arise in more sophisticated analysis than
we will do in this course:

   n! ≫ cⁿ ≫ n³ ≫ n² ≫ n^(1+ε) ≫ n log n ≫ n ≫ √n
      ≫ log² n ≫ log n ≫ log n / log log n ≫ log log n ≫ α(n) ≫ 1
Reasoning About Efficiency
Grossly reasoning about the running time of an algorithm is
usually easy given a precise-enough written description of the
algorithm.
When you really understand an algorithm, this analysis can
be done in your head. However, recognize there is always
implicitly a written algorithm/program we are reasoning
about.
Selection Sort

void selection_sort(int s[], int n)
{
       int i, j;         /* loop counters */
       int min;          /* index of minimum element */

       for (i = 0; i < n; i++) {
              min = i;
              for (j = i + 1; j < n; j++)
                     if (s[j] < s[min]) min = j;
              swap(&s[i], &s[min]);   /* swap() exchanges two ints */
       }
}
Worst Case Analysis
The outer loop goes around n times.
The inner loop goes around at most n times for each iteration
of the outer loop.
Thus selection sort takes at most n × n steps, i.e. O(n²) time
in the worst case.
More Careful Analysis
An exact count of the number of times the if statement is
executed is given by:
         S(n) = Σ_{i=0}^{n−1} Σ_{j=i+1}^{n−1} 1 = Σ_{i=0}^{n−1} (n − i − 1)

   S(n) = (n − 1) + (n − 2) + . . . + 2 + 1 + 0 = n(n − 1)/2

Thus the worst case running time is Θ(n²).
Logarithms
It is important to understand deep in your bones what
logarithms are and where they come from.
A logarithm is simply an inverse exponential function.
Saying b^x = y is equivalent to saying that x = log_b y.
Logarithms reflect how many times we can double something
until we get to n, or halve something until we get to 1.
Binary Search
In binary search we throw away half the remaining keys after
each comparison. Thus twenty comparisons suffice to find any
name in the million-name Manhattan phone book!
How many times can we halve n before getting to 1?
Answer: ⌈lg n⌉.
Logarithms and Trees
How tall a binary tree do we need until we have n leaves?
The number of potential leaves doubles with each level.
How many times can we double 1 until we get to n?
Answer: ⌈lg n⌉.
Logarithms and Bits
How many bits do you need to represent the numbers from 0
to 2^i − 1?
Each bit you add doubles the possible number of bit patterns,
so the number of bits equals lg(2^i) = i.
Logarithms and Multiplication
Recall that
               log_a(xy) = log_a(x) + log_a(y)
This is how people used to multiply before calculators, and
remains useful for analysis.
What if x = a?
The Base is not Asymptotically Important

Recall the definition c^(log_c x) = x, and the change-of-base identity

                        log_b a = log_c a / log_c b

Thus log₂ n = (1/ log₁₀₀ 2) × log₁₀₀ n. Since 1/ log₁₀₀ 2 ≈
6.64 is just a constant, it does not matter in the Big Oh.
Federal Sentencing Guidelines
2F1.1. Fraud and Deceit; Forgery; Offenses Involving Altered or Counterfeit Instruments other than Counterfeit Bearer Obligations of the United States.
(a) Base Offense Level: 6
(b) Specific Offense Characteristics


(1) If the loss exceeded $2,000, increase the offense level as follows:

                                                        Loss(Apply the Greatest)     Increase in Level
                                                        (A) $2,000 or less                 no increase
                                                        (B) More than $2,000                     add 1
                                                        (C) More than $5,000                     add 2
                                                        (D) More than $10,000                    add 3
                                                        (E) More than $20,000                    add 4
                                                        (F) More than $40,000                    add 5
                                                        (G) More than $70,000                    add 6
                                                        (H) More than $120,000                   add 7
                                                        (I) More than $200,000                   add 8
                                                        (J) More than $350,000                   add 9
                                                        (K) More than $500,000                  add 10
                                                        (L) More than $800,000                  add 11
                                                        (M) More than $1,500,000                add 12
                                                        (N) More than $2,500,000                add 13
                                                        (O) More than $5,000,000                add 14
                                                        (P) More than $10,000,000               add 15
                                                        (Q) More than $20,000,000               add 16
                                                        (R) More than $40,000,000               add 17
                                                        (S) More than $80,000,000               add 18
Make the Crime Worth the Time
The increase in punishment level grows logarithmically in the
amount of money stolen.
Thus it pays to commit one big crime rather than many small
crimes totalling the same amount.
