Earned Value for Software Development
Is NOT a Myth!

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213

William Nichols and James McHale
September 2010

© 2010 Carnegie Mellon University
Agenda

Why EV Won't Work (for Software Development Projects)
How to Make EV Work
Pulling It All Together (TSP)

Earned Value for Software Development
SPIN 08-Sept 2010
© 2008-09 Carnegie Mellon University
Why EV won’t work?




Why EV won't work for software

Software development work is hard to estimate with sufficient accuracy.
You can't get an accurate progress report from a software developer.
Until the software tests successfully, we don't know that we are done.
Software is iterative; work is revised a number of times before it is complete.
Software projects have imprecise requirements; scope isn't fixed.
So what do you do?

The 95% rule for ETC?

Developers are always 95% done. Just ask them! So how much time
remains?
The 90% rule for ETC?

It takes "90% of the estimated schedule/cost to complete 90% of the
work and ANOTHER 90% to finish it!"

DON'T do this!
How to make EV work?




The Four Core Requirements For Earned Value†

A credible schedule of the planned work
A time-phased budget for the planned work
A means of collecting progress to plan of the work performed
A means of collecting cost information for the work performed

† The Earned Value Management Maturity Model®, Ray W. Stratton, Management
Concepts, 2006.
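Given those four inputs, the basic earned-value quantities follow mechanically. A minimal sketch; the work packages, budgets, and completion figures below are illustrative, not from the talk:

```python
# Earned-value basics: each work package carries a budgeted cost (its share
# of the time-phased budget) and a completion status from progress tracking.
# All figures are illustrative.
work_packages = [
    # budget (hours), planned to be done by now?, fraction actually complete
    {"budget": 40, "planned": True,  "complete": 1.0},
    {"budget": 30, "planned": True,  "complete": 0.5},
    {"budget": 50, "planned": False, "complete": 0.0},
]
actual_hours_spent = 60  # from the cost-collection mechanism

pv = sum(wp["budget"] for wp in work_packages if wp["planned"])  # Planned Value
ev = sum(wp["budget"] * wp["complete"] for wp in work_packages)  # Earned Value
ac = actual_hours_spent                                          # Actual Cost

spi = ev / pv  # Schedule Performance Index (< 1 means behind schedule)
cpi = ev / ac  # Cost Performance Index (< 1 means over budget)

print(f"PV={pv}, EV={ev}, AC={ac}, SPI={spi:.2f}, CPI={cpi:.2f}")
```

Note that EV credits partially complete work packages; the slides that follow argue this is exactly where software projects get into trouble.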
Software development work
is hard to estimate with
sufficient accuracy.




Estimate Accurately

Software development work is hard to estimate accurately?


Start by defining “what done looks like”
Decompose the work into work packages.
Learn how to estimate a work package.


Use history as a guide




Ex: Text Pages versus Writing Time

[Scatter plot of writing time (hours) versus text pages, with fitted
regression line y = 2.4366x + 4.1297, R² = 0.6094]
Ex: LOC versus Development Time

[Scatter plot of development time (minutes) versus C++ LOC, with fitted
regression line y = 6.4537x - 252.94, R² = 0.9582]
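Charts like these are the basis of history-driven estimation: fit a line to your own past size-versus-time data, then read off a prediction for a new work package. A minimal sketch; the historical data points are illustrative, standing in for the kind of data behind the slide's C++ regression:

```python
# Ordinary least squares over historical (size, time) pairs, then a
# prediction for a new job. Historical data below is illustrative.
def fit_line(xs, ys):
    """Fit y = b1*x + b0 by least squares; return (b0, b1)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    b0 = mean_y - b1 * mean_x
    return b0, b1

# Past work packages: (C++ LOC, development minutes)
loc  = [120, 250, 400, 520, 700]
mins = [600, 1400, 2300, 3100, 4200]

b0, b1 = fit_line(loc, mins)
estimated_minutes = b0 + b1 * 300  # estimate for a new 300-LOC package
print(f"time = {b1:.2f}*LOC + {b0:.2f}; 300 LOC -> {estimated_minutes:.0f} min")
```

This is the same idea PSP formalizes (its PROBE method adds prediction intervals around the fitted line, like the 70% UPI/LPI entries on the plan summary form).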
PSP Estimating Accuracy

[Histograms of effort estimation accuracy at three PSP levels]
PSP 0: majority are under-estimating.
PSP 1: balance of over-estimates and under-estimates.
PSP 2: much tighter balance around zero.
Improving Estimating Accuracy

(Estimated Minutes - Actual Minutes) / Estimated Minutes

[Chart: effort estimation accuracy trend across programs 1-11. Mean time
misestimation and the PSP-level averages fall steadily from PSP0 through
PSP1 to PSP2; 298 developers]
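The trend metric above is simple to compute from time-log data. A quick sketch; the task estimates and actuals are illustrative:

```python
# Relative effort misestimation per task: (estimated - actual) / estimated.
# Negative values mean the task took longer than estimated.
tasks = [  # (estimated_minutes, actual_minutes) -- illustrative
    (120, 200),
    (90, 100),
    (60, 45),
]

errors = [(est - act) / est for est, act in tasks]
mean_abs_error = sum(abs(e) for e in errors) / len(errors)

for (est, act), e in zip(tasks, errors):
    print(f"est={est:4d} act={act:4d} error={e:+.2f}")
print(f"mean |error| = {mean_abs_error:.2f}")
```

Tracking the mean absolute error per program, as the chart does, shows whether estimation discipline is actually improving.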
You can’t get an accurate
progress report from a
software developer?




You can't get an accurate progress report from a
software developer?
Make a credible plan, and track to the plan.
Break the work package into smaller steps.
For each step, know what DONE looks like. Have a standard.
Use history to estimate the effort in each step.
How do you break a work package into steps?

Use a process!

    Plan
    Build
    Personal review
    Team inspection
    Test
    Field use
What is a Process?

A process is a defined and measured set of steps for doing a job.


A process guides your work.


A process is usually defined for a job that is done multiple times.


A process provides a foundation for planning.
 • A process is a template, a generic set of steps.

 • A plan is a set of steps for a specific job, plus other information such as effort,
   costs, and dates.


TSP Process Elements

Scripts: Document the process entry criteria, phases/steps, and exit
criteria. The purpose is to provide expert-level guidance as you use the
process.

Measures: Measure the process and the product. They provide insight into
how the process is working and the status of the work.

Forms: Provide a convenient and consistent framework for gathering and
retaining data.

Standards: Provide consistent definitions that guide the work and the
gathering of data.

[Example artifacts shown on the slide: a PSP process script (purpose,
inputs required, planning, development, and postmortem phases, exit
criteria) and a PSP project plan summary form (size, time-in-phase,
defect, and quality measures, with plan, actual, and to-date columns)]
Example Process Script - Requirements




Until the software tests successfully we don't
know that we are done?
Then you are done when the tests complete successfully!
But test is highly variable.
Then make a quality plan. Plan to remove the defects you've put in.
Defects can result in
    • incorrect functionality
    • poor operation
    • improper installation
    • confusing or incorrect documentation
    • error-prone modification and enhancement
Why Focus on Quality?

The fastest way to finish is to do it right the first time! Do it right, or do it
over.

Completed tasks should be verified with performance measures.

This links to TPM, Technical Performance Measures.
Why Focus on Defects?

In most engineering organizations, a significant number of resources are
dedicated to fixing defects. Often more than 40% of cost and schedule.


Defects are very costly. It is beneficial to find and remove defects early in
the process.


The reasons for managing defects are to
 • produce better products

 • improve your ability to develop products on time and within budget




Quality Measures

The TSP uses three quality measures for planning and tracking.


1.   Defect injection and removal rates
2.   Defect density
3.   Review rates




Defect Injection and Removal Rates


Defect Injection Rate is defined as the number of defects injected per hour while
performing activities in a process phase.


Defect Removal Rate is defined as the number of defects removed per hour while
performing activities in a process phase.


Typical defect injection phases include requirements and design.


Typical defect removal phases include reviews, inspections, and testing.




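Both rates fall straight out of per-phase time and defect logs. A minimal sketch; the phase hours and defect counts below are illustrative, in the spirit of PSP time and defect recording logs:

```python
# Injection/removal rate per phase = defects injected/removed per hour
# spent in that phase. Data is illustrative.
phases = {
    # phase: (hours_spent, defects_injected, defects_removed)
    "design":        (10.0,  8,  0),
    "design review": ( 2.0,  0,  5),
    "code":          (15.0, 20,  2),
    "code review":   ( 3.0,  0, 12),
    "test":          ( 8.0,  1,  9),
}

for name, (hours, injected, removed) in phases.items():
    print(f"{name:14s} inject {injected / hours:.2f}/hr,"
          f" remove {removed / hours:.2f}/hr")
```

With historical rates per phase, a quality plan can predict how many defects each removal phase should find, which is what makes the plan trackable.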
Defect Density

Defect Density is the ratio of the number of defects removed to the
product size.

A common defect density measure in software projects is the number of
defects found per thousand lines of code (defects/KLOC).

An example of another defect density measure, used by the SEI when
developing training slides, is the number of defects found per slide.

Defect density is also a good product planning, tracking, and predictive
measure.
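The defects/KLOC measure is a one-liner once size and defect data are collected. A sketch; the component names and figures are illustrative:

```python
# Defect density in defects per thousand lines of code (defects/KLOC).
# Component data is illustrative.
components = [
    # (name, new_and_changed_loc, defects_removed)
    ("parser",    2400, 11),
    ("scheduler", 1800,  4),
    ("ui",        3600, 21),
]

total_loc = sum(loc for _, loc, _ in components)
total_defects = sum(d for _, _, d in components)
density = total_defects / (total_loc / 1000.0)

print(f"{total_defects} defects in {total_loc} LOC"
      f" -> {density:.1f} defects/KLOC")
```

Comparing per-component densities against the project's historical baseline is what makes this a tracking and predictive measure, not just a scorecard.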
Review Rates

Review rate is the ratio between the size of a product and the time spent
reviewing or inspecting the product.
A common example in software projects is lines of code reviewed per hour
(LOC/hr).
Another example is the number of requirements pages reviewed per hour
(Req Pages/hr).
Review rate is the control variable for inspections and reviews.

It is used to
 • allocate appropriate time during planning
 • predict the effectiveness of the review or inspection
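Using review rate as a planning control variable looks like this in practice: pick a target rate, then derive the review time each module needs. A sketch; the 200 LOC/hr ceiling is a figure commonly cited in PSP materials, and the module sizes are illustrative:

```python
# Planning review time from a target review rate. Reviewing faster than
# the target rate predicts lower defect yield, so the rate sets a floor
# on planned review time. Target rate and module sizes are assumptions.
TARGET_RATE_LOC_PER_HR = 200

modules = {"logger": 350, "cache": 520, "api": 760}

for name, loc in modules.items():
    planned_hours = loc / TARGET_RATE_LOC_PER_HR
    print(f"{name:8s} {loc:4d} LOC -> plan {planned_hours:.1f} hr of review")
```

The same arithmetic run in reverse, actual LOC over actual hours, flags reviews that went too fast to be effective.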
Defect Injection and Removal

In your process, you will have activities that
 • inject defects
 • remove defects

Defect injection typically occurs when you
 • determine the job requirements
 • specify the product
 • build the product

Defect removal typically occurs when you
 • review the products
 • test the products
 • use the products

[Diagram: process flow Plan, Build, Personal review, Team inspection,
Test, Field use]
Defect Removal Techniques

 Reviews (inspections, walkthroughs, personal reviews)
  • examine the product or interim development artifacts of the product
  • find and eliminate defects



 Testing
  • exercises the product or parts of the product
  • proves that the product works correctly
  • identifies potential defects or symptoms



 In many cases, projects rely on testing to find and fix defects.


 When this happens, many defects are found in the field by the customer.
PSP Quality Results

[Chart: mean number of defects per KLOC removed in compile and test
versus program number, declining steadily from PSP0 through PSP1 to
PSP2; 298 developers]
The 'Worst' as Good as the 'Best'?

[Chart: compile and test defects/KLOC by quartile across PSP assignments
(Prog 1-10), from PSP training data for 810 developers. Defect reduction
from first to last program: 1st quartile 80.4%, 2nd quartile 79.0%,
3rd quartile 78.5%, 4th quartile 77.6%]
Software is iterative, work is
revised a number of times
before complete?




Software is iterative; work is revised a number of times before it is complete?
Incremental and iterative development isn’t a bug, it’s a feature!
Under the DoD 5000.02 procurement cycle, incremental and iterative approaches are used.

This is a fact of life on almost any project. (Ever change a leaky faucet?)

Apply the appropriate method to deal with this.




TSP Cyclic Development Strategy

TSP favors an iterative or cyclic development strategy:
  • develop in increments
  • use multiple cycles
  • work ahead

Projects can start on any phase or cycle.
Each phase or cycle starts with a launch or re-launch.
TSP permits whatever process structure makes the most business and technical sense.

[Figure: process flow: Launch → Development phase or cycle → Phase or cycle Postmortem → Re-launch (repeat as needed) → Project Postmortem]
Core Success Criteria for Earned Value and Software
Development
Define the outcomes of the work effort in some tangible way.
Define the way progress is going to be measured: 0/100% for each work task.
Either it’s done or it’s not done; there is no “we’re trying real hard.”
Define the tangible evidence, the date the tangible evidence is expected, and the associated costs.
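The 0/100 rule is easy to make concrete. A minimal sketch (illustrative names, not the TSP tool's actual code):

```python
# Illustrative sketch of 0/100 earned-value crediting (hypothetical
# names, not the TSP support tool): a task earns its full planned value
# only when it is done; partially finished work earns nothing.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    planned_hours: float
    done: bool = False

def earned_value(tasks):
    """Percent of total planned hours credited, counting only finished tasks."""
    total = sum(t.planned_hours for t in tasks)
    earned = sum(t.planned_hours for t in tasks if t.done)
    return 100.0 * earned / total if total else 0.0

plan = [Task("design", 8, done=True),
        Task("code", 12, done=True),
        Task("unit test", 6)]            # "trying real hard" earns 0
print(round(earned_value(plan), 1))      # -> 76.9
```

Because credit is all-or-nothing per task, small tasks make the measure responsive; a 95%-done task contributes exactly as much as one not yet started.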




Software projects have imprecise requirements; scope isn’t fixed




Use a Planning Process




Apply the appropriate method to deal with this.
Most portions of the project can usually be planned.
A common software development project mistake is staffing up too soon!
TSP: Pulling it all together




Plan Execution -1

Tasks in personal task plans are ordered according to team priorities.
Developers adjust the order and work ahead as needed.
Developers select a task planned for the current week and begin tracking time.




Plan Execution -2

While working, they also record
  • any defects they find
  • any risks they identify
  • any process improvement ideas

When they are done, they
  • stop time tracking
  • mark the task completed if it is done
  • record size if the process step calls for it
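The per-developer workflow above can be sketched in a few lines (hypothetical data structures, not the actual TSP support tool):

```python
# Illustrative sketch: start the clock on a planned task, record defects
# found while working, then stop the clock and mark the task complete.
import time

class TaskLog:
    def __init__(self):
        self.minutes = {}       # task name -> logged minutes
        self.defects = []       # (task name, description) pairs
        self.completed = set()  # tasks marked done
        self._active = None     # (task name, start timestamp)

    def start(self, task):
        self._active = (task, time.time())

    def record_defect(self, description):
        self.defects.append((self._active[0], description))

    def stop(self, done=False):
        task, started = self._active
        self.minutes[task] = self.minutes.get(task, 0.0) + (time.time() - started) / 60.0
        if done:
            self.completed.add(task)
        self._active = None

log = TaskLog()
log.start("unit test")
log.record_defect("off-by-one in loop bound")
log.stop(done=True)             # task is done: stop tracking, mark complete
```

The point is that time, defects, and completion are captured as a side effect of doing the work, not reconstructed from memory at week's end.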




Monitor and Control the Plan: Script WEEK

Team members meet each week to assess progress.
  • Role managers present their evaluation of the team’s plan and data.
  • Goal owners present status on product and business objectives.
  • Risk owners present status on risk mitigation plans and new risks.
  • Team members present status on their plans.

Plan deviations are addressed each week.
Significant deviations, such as new requirements, trigger a replan.

Performance Data Reviewed
  • Baseline Plan Value
  • Plan Value
  • Earned Value
  • Predicted Earned Value
  • Earned Value Trend
  • Plan Task Hours
  • Actual Task Hours
  • Tasks/Milestones completed
  • Tasks/Milestones past due
  • Tasks/Milestones due in the next 2 weeks
  • Effort against incomplete tasks
  • Estimation Accuracy
  • Review and Inspection Rates
  • Injection Rates
  • Removal Rates
  • Time in Phase Ratios
  • Phase and Process Yield
  • Defect Density
  • Quality Profile (QP)
  • QP Index
  • Percent Defect Free
  • Defect Removal Profile
  • Plan to Actual Defects Injected/Removed
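Two of the schedule items reviewed each week, tasks past due and tasks due in the next two weeks, are simple queries over the task plan. A sketch with a hypothetical task list:

```python
# Illustrative sketch: flag tasks past due and tasks coming due in the
# next two weeks from a list of (name, planned finish, done?) records.
from datetime import date, timedelta

tasks = [
    ("design review",    date(2010, 8, 30), True),
    ("code module A",    date(2010, 9, 6),  False),
    ("inspect module A", date(2010, 9, 20), False),
]

today = date(2010, 9, 8)
past_due = [name for name, due, done in tasks if not done and due < today]
due_soon = [name for name, due, done in tasks
            if not done and today <= due <= today + timedelta(weeks=2)]
print(past_due)   # -> ['code module A']
print(due_soon)   # -> ['inspect module A']
```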
Form Week -1

 Teams use Form Week to review status at their weekly meetings.

TSP Week Summary - Form WEEK
Name: Consolidation    Team: Voyager    Date: 3/1/2007
Status for Week: 11    Week Date: 1/22/2007    Selected Assembly: SYSTEM

Task Hours: Baseline 660.8, Current 745.5, %Change 12.8%

Weekly Data                             Plan     Actual   Plan/Actual   Plan - Actual
Schedule hours for this week            51.0     48.1     1.06          2.9
Schedule hours this cycle to date       344.0    395.0    0.87          -51.0
Earned value for this week              5.6      0.7      8.10          4.9
Earned value this cycle to date         43.8     22.0     1.99          21.8
To-date hours for tasks completed       163.9    314.5    0.52
To-date average hours per week          31.3     35.9     0.87
EV per completed task hour to date      0.134    0.070

Project End Dates: Baseline 3/19/2007, Plan 3/26/2007, Predicted 7/30/2007

OPEN MILESTONES (Assembly / Phase / Task / Resource / Plan hrs / Actual hrs / EV or PV; committed, plan, slip, and predicted dates with week numbers)
CD Mag PEG Window   MGMT   CD Mag PEG Milestone                  ne    0.0         0.0   12/4/2006 (4)    1/8/2007 (9)     1/29/2007 (12)   1/29/2007 (12)
CD Mag PEG a        MGMT   Complete Mag PEG Milestone            bf    0.0   0.0   0.0   1/8/2007 (9)     1/1/2007 (8)     1/22/2007 (11)   1/29/2007 (12)
INTEG Mag UI FW     MGMT   Complete Mag UI PEG Milestone         bf    0.0   0.0   0.0   2/5/2007 (13)    1/22/2007 (11)   2/12/2007 (14)   3/5/2007 (17)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target             ne    10.0        0.0   3/12/2007 (18)   3/12/2007 (18)   4/2/2007 (21)    5/21/2007 (28)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target Milestone   ne    0.0         0.0   3/12/2007 (18)   3/12/2007 (18)   4/2/2007 (21)    5/21/2007 (28)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target             bf    10.0  0.0   0.0   3/19/2007 (19)   3/5/2007 (17)    3/26/2007 (20)   7/2/2007 (34)

TASKS COMPLETED IN WEEK 11 (Assembly / Phase / Task / Resource / Plan hrs / Actual hrs / EV / CPI; committed, plan, and predicted dates with week numbers)
CD Mag PEG Frame            PM        CD Mag UI FW Frame Phase1 PM         bf   0.0   0.4   0.0   0.00   12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
Mag General                 MGMT      Mag Initialization                   bf   0.0   1.9   0.0   0.00                    12/18/2006 (6)   1/22/2007 (11)
Mag General                 DLDR      Mag Initialization DLDR phase 1      bf   0.0   0.0   0.0          12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag PEG Frame            PM        CD Mag UI FW Frame Phase2 PM         bf   0.3   0.0   0.0          12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag UI FW Flow           PM        CD Mag FW Flow PM                    bf   0.1   0.1   0.0   0.98   12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag UI FW Flow                                                          bf   0.0   0.0   0.0          12/18/2006 (6)   12/18/2006 (6)   1/23/2007 (11)
CD Mag PEG Window Builder   CODEINSP  CD Mag PEG Window Builder CODEINSP   nm   0.3   1.0   0.0   0.25   12/4/2006 (4)    1/1/2007 (8)     1/25/2007 (11)


Form Week -2

 The top of form Week displays a summary of current status for any week
 up to the current week.


TSP Week Summary - Form WEEK
Name: Consolidation    Team: Voyager    Date: 3/1/2007
Status for Week: 11    Week Date: 1/22/2007    Selected Assembly: SYSTEM

Task Hours: Baseline 660.8, Current 745.5, %Change 12.8%

Weekly Data                             Plan     Actual   Plan/Actual   Plan - Actual
Schedule hours for this week            51.0     48.1     1.06          2.9
Schedule hours this cycle to date       344.0    395.0    0.87          -51.0
Earned value for this week              5.6      0.7      8.10          4.9
Earned value this cycle to date         43.8     22.0     1.99          21.8
To-date hours for tasks completed       163.9    314.5    0.52
To-date average hours per week          31.3     35.9     0.87
EV per completed task hour to date      0.134    0.070

Project End Dates: Baseline 3/19/2007, Plan 3/26/2007, Predicted 7/30/2007
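Two of the to-date figures on the form can be reproduced directly from its other numbers, and the idea behind the Predicted end date can be illustrated with a deliberately simplified model (the TSP tool projects against the remaining task list, so its 7/30/2007 prediction differs from this linear sketch):

```python
# Hedged sketch: derive "EV per completed task hour" from the form's
# numbers, then show a naive linear projection of the end date.
from datetime import date, timedelta

ev_to_date        = 22.0    # earned value this cycle to date (actual)
plan_hours_done   = 163.9   # to-date plan hours for completed tasks
actual_hours_done = 314.5   # to-date actual hours for completed tasks
weeks_elapsed     = 11      # status week
week_date         = date(2007, 1, 22)

# EV per completed task hour, plan vs. actual (0.134 and 0.070 on the form)
print(round(ev_to_date / plan_hours_done, 3),
      round(ev_to_date / actual_hours_done, 3))   # -> 0.134 0.07

# Naive prediction: extrapolate the to-date EV rate over the EV remaining.
rate = ev_to_date / weeks_elapsed                  # EV earned per week
predicted = week_date + timedelta(weeks=(100.0 - ev_to_date) / rate)
print(predicted)   # well past the 3/26/2007 plan date, as on the form
```

The large gap between plan-hour and actual-hour EV rates (0.134 vs. 0.070) is itself a warning sign: completed tasks are taking roughly twice their planned effort.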




Form Week -3

   The bottom half of Form Week displays the status of open milestones,
   tasks completed in the selected week, and tasks due in the next two
   weeks.


OPEN MILESTONES (Assembly / Phase / Task / Resource / Plan hrs / Actual hrs / EV or PV; committed, plan, slip, and predicted dates with week numbers)
CD Mag PEG Window   MGMT   CD Mag PEG Milestone                  ne    0.0         0.0   12/4/2006 (4)    1/8/2007 (9)     1/29/2007 (12)   1/29/2007 (12)
CD Mag PEG a        MGMT   Complete Mag PEG Milestone            bf    0.0   0.0   0.0   1/8/2007 (9)     1/1/2007 (8)     1/22/2007 (11)   1/29/2007 (12)
INTEG Mag UI FW     MGMT   Complete Mag UI PEG Milestone         bf    0.0   0.0   0.0   2/5/2007 (13)    1/22/2007 (11)   2/12/2007 (14)   3/5/2007 (17)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target             ne    10.0        0.0   3/12/2007 (18)   3/12/2007 (18)   4/2/2007 (21)    5/21/2007 (28)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target Milestone   ne    0.0         0.0   3/12/2007 (18)   3/12/2007 (18)   4/2/2007 (21)    5/21/2007 (28)
INTEG Mag UI FW     MGMT   Test Mag UI TES On Target             bf    10.0  0.0   0.0   3/19/2007 (19)   3/5/2007 (17)    3/26/2007 (20)   7/2/2007 (34)

TASKS COMPLETED IN WEEK 11 (Assembly / Phase / Task / Resource / Plan hrs / Actual hrs / EV / CPI; committed, plan, and predicted dates with week numbers)
CD Mag PEG Frame            PM        CD Mag UI FW Frame Phase1 PM         bf   0.0   0.4   0.0   0.00   12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
Mag General                 MGMT      Mag Initialization                   bf   0.0   1.9   0.0   0.00                    12/18/2006 (6)   1/22/2007 (11)
Mag General                 DLDR      Mag Initialization DLDR phase 1      bf   0.0   0.0   0.0          12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag PEG Frame            PM        CD Mag UI FW Frame Phase2 PM         bf   0.3   0.0   0.0          12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag UI FW Flow           PM        CD Mag FW Flow PM                    bf   0.1   0.1   0.0   0.98   12/4/2006 (4)    12/18/2006 (6)   1/23/2007 (11)
CD Mag UI FW Flow                                                          bf   0.0   0.0   0.0          12/18/2006 (6)   12/18/2006 (6)   1/23/2007 (11)
CD Mag PEG Window Builder   CODEINSP  CD Mag PEG Window Builder CODEINSP   nm   0.3   1.0   0.0   0.25   12/4/2006 (4)    1/1/2007 (8)     1/25/2007 (11)




Schedule Management -1

TSP teams routinely meet their schedule commitments.
They use earned value management, task hour management, and quality
management at the team and personal level to help manage schedule.




Teams monitor earned value per week and cumulative earned value for the cycle.

TSP Week Summary - Form WEEK (excerpt)
Weekly Data                             Plan     Actual   Plan/Actual   Plan - Actual
Earned value for this week              5.6      0.7      8.10          4.9
Earned value this cycle to date         43.8     22.0     1.99          21.8
To-date hours for tasks completed       163.9    314.5    0.52

Project End Dates: Baseline 3/19/2007, Plan 3/26/2007, Predicted 7/30/2007




Schedule Management -2

Intuit’s 2007 release of QuickBooks met every major milestone and delivered 33% more functionality than planned.
First-time TSP projects at Microsoft had a mean schedule error ten times smaller than that of non-TSP projects at Microsoft, as the following table shows.

Microsoft Schedule Results    Non-TSP Projects               TSP Projects
Released on Time              42%                            66%
Average Days Late             25                             6
Mean Schedule Error           10%                            1%
Sample Size                   80                             15




 Source: Microsoft


Reporting to Higher Level Management: Script STATUS


The team presents status to
management and the customer at
specified intervals.
    •   weekly, bi-weekly, monthly
Project status
    •   earned value and projection
    •   task hours
    •   milestones planned/completed
    •   product quality indicators
Project risks
    •   status of existing risks
    •   newly-identified risks


Questions




            ?


Contact Information

Jim McHale – jdm@sei.cmu.edu
Bill Nichols – wrn@sei.cmu.edu





Contenu connexe

Similaire à Pitt spin-sept-10-ev-in-sw-projects-psp

Puneet Samaiya+6yr+Manual Testing+IBM PUNE
Puneet Samaiya+6yr+Manual Testing+IBM PUNEPuneet Samaiya+6yr+Manual Testing+IBM PUNE
Puneet Samaiya+6yr+Manual Testing+IBM PUNE
Puneet Samaiya
 
Cv Somnath Paramanick Nw Ep
Cv Somnath Paramanick Nw EpCv Somnath Paramanick Nw Ep
Cv Somnath Paramanick Nw Ep
Somnath Destiny
 
Rajeshwari K A 9+ years as Java Developer and Team lead-1
Rajeshwari K A 9+ years as Java Developer and Team lead-1Rajeshwari K A 9+ years as Java Developer and Team lead-1
Rajeshwari K A 9+ years as Java Developer and Team lead-1
Rajeshwari KA
 
CV_HoangLeAnhTuan
CV_HoangLeAnhTuanCV_HoangLeAnhTuan
CV_HoangLeAnhTuan
Tuan Hoang
 
AvneetSingh_Resume
AvneetSingh_ResumeAvneetSingh_Resume
AvneetSingh_Resume
Avneet Singh
 
LatestResume
LatestResumeLatestResume
LatestResume
Vimal Raj
 
jacob_gulliver_resume_july2016
jacob_gulliver_resume_july2016jacob_gulliver_resume_july2016
jacob_gulliver_resume_july2016
Jacob Gulliver
 

Similaire à Pitt spin-sept-10-ev-in-sw-projects-psp (20)

Dinu Baby CV 6Y.pdf
Dinu Baby CV 6Y.pdfDinu Baby CV 6Y.pdf
Dinu Baby CV 6Y.pdf
 
Technical debt
Technical debtTechnical debt
Technical debt
 
Yauheni_Semchanka_CV
Yauheni_Semchanka_CVYauheni_Semchanka_CV
Yauheni_Semchanka_CV
 
Integrating Quality into Portfolio Management
Integrating Quality into Portfolio Management Integrating Quality into Portfolio Management
Integrating Quality into Portfolio Management
 
Zhiyu-CV
Zhiyu-CVZhiyu-CV
Zhiyu-CV
 
sunny singhal
sunny singhalsunny singhal
sunny singhal
 
Splunk Discovery Köln - 17-01-2020 - Splunk for ITOps
Splunk Discovery Köln - 17-01-2020 - Splunk for ITOpsSplunk Discovery Köln - 17-01-2020 - Splunk for ITOps
Splunk Discovery Köln - 17-01-2020 - Splunk for ITOps
 
Todd_Emelo
Todd_EmeloTodd_Emelo
Todd_Emelo
 
Michael Coté - The Eternal Recurrence of DevOps
Michael Coté - The Eternal Recurrence of DevOpsMichael Coté - The Eternal Recurrence of DevOps
Michael Coté - The Eternal Recurrence of DevOps
 
Mainframes project
Mainframes projectMainframes project
Mainframes project
 
Madhab_CV
Madhab_CVMadhab_CV
Madhab_CV
 
Puneet Samaiya+6yr+Manual Testing+IBM PUNE
Puneet Samaiya+6yr+Manual Testing+IBM PUNEPuneet Samaiya+6yr+Manual Testing+IBM PUNE
Puneet Samaiya+6yr+Manual Testing+IBM PUNE
 
Resume_nakri
Resume_nakriResume_nakri
Resume_nakri
 
Cv Somnath Paramanick Nw Ep
Cv Somnath Paramanick Nw EpCv Somnath Paramanick Nw Ep
Cv Somnath Paramanick Nw Ep
 
Rajeshwari K A 9+ years as Java Developer and Team lead-1
Rajeshwari K A 9+ years as Java Developer and Team lead-1Rajeshwari K A 9+ years as Java Developer and Team lead-1
Rajeshwari K A 9+ years as Java Developer and Team lead-1
 
CV_HoangLeAnhTuan
CV_HoangLeAnhTuanCV_HoangLeAnhTuan
CV_HoangLeAnhTuan
 
AvneetSingh_Resume
AvneetSingh_ResumeAvneetSingh_Resume
AvneetSingh_Resume
 
Continuous Integration and Continuous Deployment
Continuous Integration and Continuous DeploymentContinuous Integration and Continuous Deployment
Continuous Integration and Continuous Deployment
 
LatestResume
LatestResumeLatestResume
LatestResume
 
jacob_gulliver_resume_july2016
jacob_gulliver_resume_july2016jacob_gulliver_resume_july2016
jacob_gulliver_resume_july2016
 

Dernier

CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
giselly40
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
Joaquim Jorge
 

Dernier (20)

Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
CNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of ServiceCNv6 Instructor Chapter 6 Quality of Service
CNv6 Instructor Chapter 6 Quality of Service
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
Strategies for Unlocking Knowledge Management in Microsoft 365 in the Copilot...
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 

Pitt spin-sept-10-ev-in-sw-projects-psp

  • 1. Earned Value for Software Development Is NOT a Myth! Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213 William Nichols and James McHale September 2010 © 2010 Carnegie Mellon University
  • 2. Agenda Why EV won’t Work (for Software Development Projects) How to Make EV work Pulling it all Together (TSP) Earned Value for Software Development SPIN 08-Sept 2010 2 © 2008-09 Carnegie Mellon University
  • 3. Why EV won’t work? Earned Value for Software Development SPIN 08-Sept 2010 3 © 2008-09 Carnegie Mellon University
  • 4. Why EV won’t work for software Software development work is hard to estimate with sufficient accuracy. Earned Value for Software Development SPIN 08-Sept 2010 4 © 2008-09 Carnegie Mellon University
  • 5. Why EV won’t work for software Software project work is hard to estimate with sufficient accuracy. You can’t get an accurate progress report from a software developer. Earned Value for Software Development SPIN 08-Sept 2010 5 © 2008-09 Carnegie Mellon University
  • 6. Why EV won’t work for software Software development work is hard to estimate with sufficient accuracy. You can’t get an accurate progress report from a software developer. Until the software tests successfully we don’t know that we are done. Earned Value for Software Development SPIN 08-Sept 2010 6 © 2008-09 Carnegie Mellon University
  • 7. Why EV won’t work for software Software development work is hard to estimate with sufficient accuracy. You can’t get an accurate progress report from a software developer. Until the software tests successfully we don’t know that we are done. Software is iterative; work is revised a number of times before it is complete. Earned Value for Software Development SPIN 08-Sept 2010 7 © 2008-09 Carnegie Mellon University
  • 8. Why EV won’t work for software Software development work is hard to estimate with sufficient accuracy. You can’t get an accurate progress report from a software developer. Until the software tests successfully we don’t know that we are done. Software is iterative; work is revised a number of times before it is complete. Software projects have imprecise requirements; scope isn’t fixed. Earned Value for Software Development SPIN 08-Sept 2010 8 © 2008-09 Carnegie Mellon University
  • 9. So what do you do? The 95% rule for ETC? Developers are always 95% done. Just ask them! So how much time remains? Earned Value for Software Development SPIN 08-Sept 2010 9 © 2008-09 Carnegie Mellon University
  • 10. So what do you do? The 95% rule for ETC? It takes “95% of the estimated schedule/cost to complete 95% of the work and ANOTHER 95% to finish it!” Earned Value for Software Development SPIN 08-Sept 2010 10 © 2008-09 Carnegie Mellon University
  • 11. So what do you do? The 90% rule for ETC? It takes “90% of the estimated schedule/cost to complete 90% of the work and ANOTHER 90% to finish it!” DON’T do this! Earned Value for Software Development SPIN 08-Sept 2010 11 © 2008-09 Carnegie Mellon University
  • 12. How to make EV work? Earned Value for Software Development SPIN 08-Sept 2010 12 © 2008-09 Carnegie Mellon University
  • 13. The Four Core Requirements For Earned Value† A credible schedule of the planned work A time phased budget for the planned work A means of collecting progress to plan of the work performed A means of collecting cost information for the work performed † The Earned Value Management Maturity Model®, Ray W. Stratton, Management Concepts, 2006. Earned Value for Software Development SPIN 08-Sept 2010 13 © 2008-09 Carnegie Mellon University
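Those four requirements feed the standard earned-value arithmetic. A minimal sketch follows; the formulas are the standard EVM definitions rather than anything taken from these slides, and the sample numbers are purely illustrative:

```python
def ev_metrics(pv, ev, ac):
    """Standard earned-value variances and indices.

    pv: planned value (budgeted cost of work scheduled)
    ev: earned value  (budgeted cost of work performed)
    ac: actual cost   (actual cost of work performed)
    """
    return {
        "SV": ev - pv,    # schedule variance: negative means behind schedule
        "CV": ev - ac,    # cost variance: negative means over budget
        "SPI": ev / pv,   # schedule performance index
        "CPI": ev / ac,   # cost performance index
    }

# Illustrative status: 100 hours of work planned, 80 earned, 90 spent.
m = ev_metrics(pv=100.0, ev=80.0, ac=90.0)
```

With SPI = 0.8 and CPI just under 0.89, this hypothetical project is both behind schedule and over budget, which is exactly what the variances and indices are there to surface.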
  • 14. Software development work is hard to estimate with sufficient accuracy. Earned Value for Software Development SPIN 08-Sept 2010 14 © 2008-09 Carnegie Mellon University
  • 15. Estimate Accurately Software development work is hard to estimate accurately? Start by defining “what done looks like” Decompose the work into work packages. Learn how to estimate a work package. Use history as a guide Earned Value for Software Development SPIN 08-Sept 2010 15 © 2008-09 Carnegie Mellon University
  • 16. Ex: Text Pages versus Writing Time [Scatter plot: writing time in hours versus text pages, with fitted regression line y = 2.4366x + 4.1297, R² = 0.6094] Earned Value for Software Development SPIN 08-Sept 2010 16 © 2008-09 Carnegie Mellon University
  • 17. Ex: LOC versus Development Time [Scatter plot: development time in minutes versus C++ LOC, with fitted regression line y = 6.4537x - 252.94, R² = 0.9582] Earned Value for Software Development SPIN 08-Sept 2010 17 © 2008-09 Carnegie Mellon University
  • 18. PSP Estimating Accuracy [Histograms of effort estimation accuracy by PSP level: PSP 0, majority are under-estimating; PSP 1, balance of over-estimates and under-estimates; PSP 2, much tighter balance around zero] Earned Value for Software Development SPIN 08-Sept 2010 18 © 2008-09 Carnegie Mellon University
  • 19. Improving Estimating Accuracy [Trend chart: mean effort misestimation, (Estimated Minutes - Actual Minutes) / Estimated Minutes, by program number for PSP0, PSP1, and PSP2; 298 developers; average misestimation shrinks as the PSP level advances] Earned Value for Software Development SPIN 08-Sept 2010 19 © 2008-09 Carnegie Mellon University
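The y-axis on that trend chart, (Estimated Minutes - Actual Minutes) / Estimated Minutes, is a signed misestimation fraction that any team can compute from its own logs. A small sketch with made-up estimate/actual pairs:

```python
def estimation_error(estimated_minutes, actual_minutes):
    """Signed misestimation: positive = over-estimate, negative = under-estimate."""
    return (estimated_minutes - actual_minutes) / estimated_minutes

# Hypothetical (estimate, actual) pairs for three finished tasks.
pairs = [(100, 150), (200, 180), (120, 120)]
errors = [estimation_error(e, a) for e, a in pairs]
mean_error = sum(errors) / len(errors)
```

A mean near zero with a shrinking spread over successive programs is what the PSP trend above shows.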
  • 20. You can’t get an accurate progress report from a software developer? Earned Value for Software Development SPIN 08-Sept 2010 20 © 2008-09 Carnegie Mellon University
  • 21. You can’t get an accurate progress report from a software developer? Make a credible plan, and track to the plan. Break the work package into smaller steps. For each step, know what DONE looks like. Have a standard. Use history to estimate the effort in each step. Earned Value for Software Development SPIN 08-Sept 2010 21 © 2008-09 Carnegie Mellon University
  • 22. You can’t get an accurate progress report from a software developer? Make a credible plan, and track to the plan. Break the work package into smaller steps. For each step, know what DONE looks like. Have a standard. Use history to estimate the effort in each step. Earned Value for Software Development SPIN 08-Sept 2010 22 © 2008-09 Carnegie Mellon University
  • 23. How do you break a work package into steps? Use a process! Plan Build Personal review Team inspection Test Field use Earned Value for Software Development SPIN 08-Sept 2010 23 © 2008-09 Carnegie Mellon University
  • 24. What is a Process? A process is a defined and measured set of steps for doing a job. A process guides your work. A process is usually defined for a job that is done multiple times. A process provides a foundation for planning. • A process is a template, a generic set of steps. • A plan is a set of steps for a specific job, plus other information such as effort, costs, and dates. Earned Value for Software Development SPIN 08-Sept 2010 24 © 2008-09 Carnegie Mellon University
  • 25. TSP Process Elements Scripts: document the process entry criteria, phases/steps, and exit criteria. Their purpose is to provide expert-level guidance as you use the process. Measures: measure the process and the product. They provide insight into how the process is working and the status of the work. Forms: provide a convenient and consistent framework for gathering and retaining data. Standards: provide consistent definitions that guide the work and the gathering of data. [Slide illustrates these with an example PSP process script (planning, development, and postmortem phases, each with entry and exit criteria) and a Project Plan Summary form holding plan, actual, and to-date size, time-in-phase, and defect data] Earned Value for Software Development SPIN 08-Sept 2010 25 © 2008-09 Carnegie Mellon University
  • 26. Example Process Script - Requirements Earned Value for Software Development SPIN 08-Sept 2010 26 © 2008-09 Carnegie Mellon University
  • 27. Until the software tests successfully we don’t know that we are done? Then you are done when the tests complete successfully! But test is highly variable. So make a quality plan. Plan to remove the defects you’ve put in. Defects can result in • incorrect functionality • poor operation • improper installation • confusing or incorrect documentation • error-prone modification and enhancement Earned Value for Software Development SPIN 08-Sept 2010 27 © 2008-09 Carnegie Mellon University
  • 28. Why Focus on Quality? The fastest way to finish is to do it right the first time! Do it right, or do it over Completed tasks should be verified with performance measures. This links to TPM, Technical Performance Measures Earned Value for Software Development SPIN 08-Sept 2010 28 © 2008-09 Carnegie Mellon University
  • 29. Why Focus on Defects? In most engineering organizations, a significant share of resources is dedicated to fixing defects, often more than 40% of cost and schedule. Defects are very costly. It is beneficial to find and remove defects early in the process. The reasons for managing defects are to • produce better products • improve your ability to develop products on time and within budget Earned Value for Software Development SPIN 08-Sept 2010 29 © 2008-09 Carnegie Mellon University
  • 30. Quality Measures The TSP uses three quality measures for planning and tracking. 1. Defect injection and removal rates 2. Defect density 3. Review rates Earned Value for Software Development SPIN 08-Sept 2010 30 © 2008-09 Carnegie Mellon University
  • 31. Defect Injection and Removal Rates Defect Injection Rate is defined as the number of defects injected per hour while performing activities in a process phase. Defect Removal Rate is defined as the number of defects removed per hour while performing activities in a process phase. Typical defect injection phases include requirements and design. Typical defect removal phases include reviews, inspections, and testing. Earned Value for Software Development SPIN 08-Sept 2010 31 © 2008-09 Carnegie Mellon University
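Both definitions above are plain ratios of defects to hours in a phase. A sketch with illustrative phase data (the phase names and counts are made up, not from the slides):

```python
def defects_per_hour(defects, hours):
    """Injection or removal rate for a process phase."""
    return defects / hours

# Hypothetical phase data pulled from time and defect logs.
design_injection_rate = defects_per_hour(defects=8, hours=10.0)  # injected per design hour
review_removal_rate = defects_per_hour(defects=6, hours=4.0)     # removed per review hour
```

Comparing the two rates tells you whether a removal phase is keeping up with the phases that feed it defects.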
  • 32. Defect Density Defect Density is the ratio of the number of defects removed and the product size. A common defect density measure in software projects is number of defects found per thousand lines of code (defects/KLOC). An example of another defect density measure, used by the SEI when developing training slides, is number of defects found per slide. Defect density is also a good product planning, tracking, and predictive measure. Earned Value for Software Development SPIN 08-Sept 2010 32 © 2008-09 Carnegie Mellon University
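The defects/KLOC form of the measure is a one-line calculation; the component size and defect count below are illustrative:

```python
def defect_density(defects_removed, size_loc):
    """Defects per thousand lines of code (defects/KLOC)."""
    return defects_removed / (size_loc / 1000.0)

# e.g. 12 defects found while testing a hypothetical 2,400-LOC component.
density = defect_density(defects_removed=12, size_loc=2400)
```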
  • 33. Review Rates Review rate is the ratio of the size of a product to the time spent reviewing or inspecting the product. A common example in software projects is lines of code reviewed per hour (LOC/hr). Another example is number of requirements pages reviewed per hour (Req Pages/hr). Review rate is the control variable for inspections and reviews. It is used to • allocate appropriate time during planning • predict the effectiveness of the review or inspection Earned Value for Software Development SPIN 08-Sept 2010 33 © 2008-09 Carnegie Mellon University
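Because review rate is the control variable, a planner can invert it to budget review time. A sketch assuming a hypothetical target of 200 LOC/hr; the right target comes from your own inspection history, not from this example:

```python
def planned_review_hours(size_loc, target_rate_loc_per_hour):
    """Hours to budget so the review runs no faster than the target rate."""
    return size_loc / target_rate_loc_per_hour

# Reviewing a 600-LOC module at a 200 LOC/hr target needs 3 hours.
hours = planned_review_hours(size_loc=600, target_rate_loc_per_hour=200)
```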
  • 34. Defect Injection and Removal In your process, you will have activities that • inject defects • remove defects Defect injection typically occurs when you • determine the job requirements • specify the product • build the product Defect removal typically occurs when you • review the products • test the products • use the products [Diagram: process phases Plan, Build, Personal review, Team inspection, Test, Field use] Earned Value for Software Development SPIN 08-Sept 2010 34 © 2008-09 Carnegie Mellon University
  • 35. Defect Removal Techniques Reviews (inspections, walkthroughs, personal reviews) • examine the product or interim development artifacts of the product • find and eliminate defects Testing • exercises the product or parts of the product • proves that the product works correctly • identifies potential defects or symptoms In many cases, projects rely on testing to find and fix defects. When this happens, many defects are found in the field by the customer. Earned Value for Software Development SPIN 08-Sept 2010 35 © 2008-09 Carnegie Mellon University
  • 36. PSP Quality Results [Chart: mean compile + test defects per KLOC by program number, broken out by PSP level (PSP0, PSP1, PSP2); defect density falls steadily as the PSP level advances; 298 developers] Earned Value for Software Development SPIN 08-Sept 2010 36 © 2008-09 Carnegie Mellon University
  • 37. The ‘Worst’ as Good as the ‘Best’? Compile and Test Defects - from PSP Training [Chart: defects/KLOC by PSP assignment number (programs 1 through 10) for the 1st through 4th quartiles of 810 developers; defect reduction from first to last program: 1Q 80.4%, 2Q 79.0%, 3Q 78.5%, 4Q 77.6%] Earned Value for Software Development SPIN 08-Sept 2010 37 © 2008-09 Carnegie Mellon University
  • 38. Software is iterative, work is revised a number of times before it is complete? Earned Value for Software Development SPIN 08-Sept 2010 38 © 2008-09 Carnegie Mellon University
  • 39. Software is iterative, work is revised a number of times before it is complete? Incremental and iterative development isn’t a bug, it’s a feature! In DoD 5000.02’s procurement cycle, incremental and iterative approaches are used. This is a fact on almost any project. (Ever change a leaky faucet?) Apply the appropriate method to deal with this. Earned Value for Software Development SPIN 08-Sept 2010 39 © 2008-09 Carnegie Mellon University
  • 40. TSP Cyclic Development Strategy TSP favors an iterative or cyclic development strategy: • develop in increments • use multiple cycles • work ahead Projects can start on any phase or cycle. Each phase or cycle starts with a launch or re-launch and ends with a phase or cycle postmortem; the project ends with a project postmortem. TSP permits whatever process structure makes the most business and technical sense. [Diagram: Launch, development phase or cycle, phase or cycle postmortem, with re-launches between cycles, ending in a project postmortem] Earned Value for Software Development SPIN 08-Sept 2010 40 © 2008-09 Carnegie Mellon University
  • 41. Core Success Criteria for Earned Value and Software Development Define the outcomes of the work effort in some tangible way. Define the way progress is going to be measured: 0/100% for each work task. Either it’s done or it’s not done; there is no “we’re trying real hard.” Define the tangible evidence, the date the tangible evidence is expected, and the associated costs. Earned Value for Software Development SPIN 08-Sept 2010 41 © 2008-09 Carnegie Mellon University
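The 0/100% crediting rule described above can be sketched directly: a task contributes its full planned value only when it is complete, never partial credit. The task list and hours below are illustrative:

```python
def percent_earned_value(tasks):
    """tasks: list of (planned_hours, is_done) pairs. 0/100% crediting:
    a task earns its planned hours only when done."""
    total_planned = sum(hours for hours, _ in tasks)
    earned = sum(hours for hours, done in tasks if done)
    return 100.0 * earned / total_planned

# Four hypothetical tasks; only the first two are done.
ev = percent_earned_value([(10, True), (20, True), (30, False), (40, False)])
```

Here 30 of 100 planned hours are credited, so the team is 30% earned regardless of how far along the open tasks claim to be.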
  • 42. Software projects have imprecise requirements; scope isn’t fixed Earned Value for Software Development SPIN 08-Sept 2010 42 © 2008-09 Carnegie Mellon University
  • 43. Use a Planning Process Apply the appropriate method to deal with this. Usually most portions of the project can be planned. A common SW Dev project mistake is staffing up too soon! Earned Value for Software Development SPIN 08-Sept 2010 43 © 2008-09 Carnegie Mellon University
  • 44. TSP: Pulling it all together Earned Value for Software Development SPIN 08-Sept 2010 44 © 2008-09 Carnegie Mellon University
  • 45. Plan Execution -1 Tasks in personal task plans are ordered according to team priorities. Developers adjust the order and work ahead as needed. Developers select a task planned for the current week and begin tracking time. Earned Value for Software Development SPIN 08-Sept 2010 45 © 2008-09 Carnegie Mellon University
  • 46. Plan Execution -2 While working they also record • any defects they find • any risks they identify • any process improvement ideas When they are done they • stop time tracking • mark the task completed if done • record size if the process step calls for it Earned Value for Software Development SPIN 08-Sept 2010 46 © 2008-09 Carnegie Mellon University
  • 47. Monitor and Control the Plan: Script WEEK Team members meet each week to assess progress. Role managers present their evaluation of the team’s plan and data. Goal owners present status on product and business objectives. Risk owners present status on risk mitigation plans and new risks. Team members present status on their plans. Plan deviations are addressed each week. Significant deviations like new requirements trigger a replan. Performance data reviewed: Baseline Plan Value, Plan Value, Earned Value, Predicted Earned Value, Earned Value Trend, Plan Task Hours, Actual Task Hours, Tasks/Milestones completed, Tasks/Milestones past due, Tasks/Milestones next 2 weeks, Effort against incomplete tasks, Estimation Accuracy, Review and Inspection Rates, Injection Rates, Removal Rates, Time in Phase Ratios, Phase and Process Yield, Defect Density, Quality Profile (QP), QP Index, Percent Defect Free, Defect Removal Profile, Plan to Actual Defects Injected/Removed. Earned Value for Software Development SPIN 08-Sept 2010 47 © 2008-09 Carnegie Mellon University
  • 48. Form Week -1 Teams use Form Week to review status at their weekly meetings. [Screenshot: TSP Week Summary (Form WEEK) for team Voyager, status for week 11 (week date 1/22/2007), showing plan vs. actual task hours and earned value, baseline/plan/predicted project end dates, open milestones, and tasks completed in the week] Earned Value for Software Development SPIN 08-Sept 2010 48 © 2008-09 Carnegie Mellon University
  • 49. Form Week -2 The top of Form Week displays a summary of current status for any week up to the current week. [Screenshot detail, week 11: schedule hours for the week (plan 51.0, actual 48.1), schedule hours this cycle to date (plan 344.0, actual 395.0), earned value for the week (plan 5.6, actual 0.7), earned value this cycle to date (plan 43.8, actual 22.0), and project end dates (baseline 3/19/2007, plan 3/26/2007, predicted 7/30/2007)] Earned Value for Software Development SPIN 08-Sept 2010 49 © 2008-09 Carnegie Mellon University
  • 50. Form Week -3 The bottom half of Form Week displays the status of open milestones, tasks completed in the selected week, and tasks due in the next two weeks. [Screenshot detail: milestone and task rows showing plan and actual hours, earned value, CPI, and committed, plan, slip, and predicted dates] Earned Value for Software Development SPIN 08-Sept 2010 50 © 2008-09 Carnegie Mellon University
  • 51. Schedule Management -1 TSP teams routinely meet their schedule commitments. They use earned value management, task hour management, and quality management at the team and personal level to help manage schedule. Teams monitor earned value per week and cumulative earned value for the cycle. [Form WEEK excerpt, week 11: earned value for the week (plan 5.6, actual 0.7), earned value this cycle to date (plan 43.8, actual 22.0); project end dates baseline 3/19/2007, plan 3/26/2007, predicted 7/30/2007] Earned Value for Software Development SPIN 08-Sept 2010 51 © 2008-09 Carnegie Mellon University
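One way a “predicted” end date like the one on Form WEEK can be derived is by projecting the remaining earned value at the team’s average weekly EV rate to date. A simplified sketch, with illustrative numbers rather than the Voyager data:

```python
def predicted_weeks_remaining(total_ev, earned_to_date, weeks_elapsed):
    """Project weeks remaining from the average EV earned per week so far."""
    weekly_rate = earned_to_date / weeks_elapsed
    return (total_ev - earned_to_date) / weekly_rate

# Hypothetical cycle: 100 EV total, 22 earned after 11 weeks (2 EV/week),
# so 78 EV remains at that rate.
remaining = predicted_weeks_remaining(total_ev=100.0, earned_to_date=22.0,
                                      weeks_elapsed=11)
```

This is why an EV run rate well below plan pushes the predicted date far past the committed one, even when the team is working full weeks.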
  • 52. Schedule Management -2 Intuit’s 2007 release of QuickBooks met every major milestone and delivered 33% more functionality than planned. First-time TSP projects at Microsoft had a 10 times better mean schedule error than non-TSP projects at Microsoft, as reflected in the following table.

    Microsoft Schedule Results   Non-TSP Projects   TSP Projects
    Released on Time             42%                66%
    Average Days Late            25                 6
    Mean Schedule Error          10%                1%
    Sample Size                  80                 15

    Source: Microsoft

    Earned Value for Software Development SPIN 08-Sept 2010 52 © 2008-09 Carnegie Mellon University
  • 53. Reporting to Higher Level Management: Script STATUS The team presents status to management and the customer at specified intervals. • weekly, bi-weekly, monthly Project status • earned value and projection • task hours • milestones planned/completed • product quality indicators Project risks • status of existing risks • newly-identified risks Earned Value for Software Development SPIN 08-Sept 2010 53 © 2008-09 Carnegie Mellon University
  • 54. Questions ? Earned Value for Software Development SPIN 08-Sept 2010 54 © 2008-09 Carnegie Mellon University
  • 55. Contact Information Jim McHale – jdm@sei.cmu.edu Bill Nichols – wrn@sei.cmu.edu Earned Value for Software Development SPIN 08-Sept 2010 55 © 2008-09 Carnegie Mellon University
  • 56. Earned Value for Software Development SPIN 08-Sept 2010 56 © 2008-09 Carnegie Mellon University