2. What and Why
- We have to do it
- No one likes it
- We're always wrong
- Real money and time are at stake
3. References
- Kjetil Moløkken and Magne Jørgensen, "A Review of Surveys on Software Effort Estimation"
- Magne Jørgensen, Karl Halvor Teigen, and Kjetil Moløkken, "Better Sure than Safe? Over-confidence in Judgment-Based Software Development Effort Prediction Intervals"
4. How
- Expert-based methods
  - Expert consultation
  - Intuition and experience
  - Analogy
- Model-based (software cost models)
  - COCOMO
  - Use-case-based estimation
  - FPA metrics or other algorithm-driven methods
- Other
  - Price-to-win
  - Capacity-related
  - Top-down
  - Bottom-up
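As a concrete illustration of a model-based method, the basic COCOMO effort equation is Effort = a * KLOC^b person-months, with mode-dependent coefficients from Boehm's 1981 formulation. A minimal sketch (the function name and the 32 KLOC example are illustrative, not from the slides):

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO effort estimate in person-months.

    Coefficients (a, b) per development mode, from Boehm (1981).
    """
    coeffs = {
        "organic": (2.4, 1.05),
        "semi-detached": (3.0, 1.12),
        "embedded": (3.6, 1.20),
    }
    a, b = coeffs[mode]
    return a * kloc ** b

# Example: a 32 KLOC organic project estimates to roughly 91 person-months.
effort = cocomo_basic(32)
```

Note that such algorithmic models still depend on an accurate size estimate (KLOC or function points) as input, which is itself a judgment call.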
5. Results
- Expert estimation is the most frequently used method
- No evidence that formal methods lead (on average) to more accurate estimates
- Cost overruns are more common than schedule overruns
- Average cost overrun of 30-40%
6. Results
- Accuracy (according to one study):
  - Cost overrun (34%): over budget 61%, under budget 10%
  - Schedule overrun (22%): completed after schedule 65%, completed before schedule 4%
7. Results
- Prediction intervals (estimated min/max)
- In one study, students provided better prediction intervals than "experts"
- "The software professional may feel a pressure to indicate high development skills through narrow prediction intervals"
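The overconfidence finding can be made concrete by measuring the hit rate of stated min/max intervals: the fraction of actual outcomes that land inside them. A minimal sketch with hypothetical numbers (the interval and actual-effort values below are invented for illustration):

```python
def hit_rate(intervals, actuals):
    """Fraction of actual values falling inside the stated [min, max] intervals."""
    hits = sum(1 for (lo, hi), actual in zip(intervals, actuals) if lo <= actual <= hi)
    return hits / len(actuals)

# Hypothetical 90%-confidence effort intervals (hours) vs. actual effort.
intervals = [(80, 120), (40, 60), (100, 150), (20, 30)]
actuals = [130, 55, 160, 25]
hit_rate(intervals, actuals)  # 0.5 -- far below the stated 90% confidence
```

If estimators were calibrated, intervals stated at 90% confidence should contain the actual effort about 90% of the time; hit rates well below that are the overconfidence the study describes.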
8. Blame
- Cost overruns:
  - Over-optimistic estimates
  - Changes in design or implementation
- Schedule overruns:
  - Optimistic planning
  - Frequent changes in specification
  - Frequent requests for changes by users
  - Users' lack of understanding of their own requirements
- Other causes (not just bad estimation)