1 in 20 IT projects deliver on time and satisfy business management.
That means 95% of IT projects are partial or total failures.
This presentation explores the causes and potential strategies for delivering a project in the 5% bracket.
3. 1 in 20 IT projects deliver on time and satisfy business management.
Of IT projects…
6 in 10 fail to come in on time.
Half fail to meet budget.
Half cost more to maintain than planned.
4 in 10 fail to deliver business value.
July 17, 2012 http://www.adamcole.ca Slide 3
4. In a single year…
Almost 25% of IT projects are cancelled
Cancelled projects cost $67B
Non-cancelled projects had cost overruns of $21B
80% of budgets go towards fixing flaws
…not counting post-release support and patch management
5. Standish Group: “60% of the IT projects conducted in 2010 failed to
deliver on their goals because they either came in late or over
budget, or they had fewer features than were originally specified”
Global IT investment in 2012 will total >$2T
6. Software estimating is universally abysmal!
The bad news:
Poor estimates harm your business and your integrity.
The good news:
You are not alone.
There are many opportunities for improvement.
7. Why is this the case?
Good estimating is labor intensive…
8. …and there is often pressure to estimate
within budget and schedule constraints;
However…
9. At project conception there is a need for
an off-the-cuff estimate
Even with the usual caveats, once this estimate is provided,
expectations are set which are difficult to change.
Budgets and schedules are built on this estimate
Bids are prepared on this estimate
Customer/stakeholder expectations are set on this
estimate
Business decisions are made based on this estimate
10. The trap:
Key requirements are in conflict
11. Project estimation errors are pervasive
Estimation errors are not constrained to IT
projects
12. Canadian Gun Registry
500x original estimate
(That’s 50,000% over budget!)
13. Boston’s Big Dig
$20B over original estimate
14. Britain & France’s “Chunnel”
$8B over original estimate
15. Sydney Opera House
10 years late, 14x over budget
16. US War in Iraq
$2,950B over original budget
Yes, that is a $3T estimating error!
Several years behind schedule
No possibility of meeting original objectives
17. Project estimation errors are common
UK’s National Health Services NPfIT (>$25B over
original estimate)
London’s Stock Exchange Taurus project (132x over
original estimate)
Irish Health Services PPARS (17x over original
estimate)
Montreal’s Olympic Stadium (12x over budget)
Vancouver’s 2010 Winter Olympics (7x over budget)
The project manager’s nightmare list:
http://en.wikipedia.org/wiki/Cost_overrun
18. Top level lessons:
Estimating is tough, especially accurate
estimating.
Estimating takes much time, effort, and
involvement from all stakeholders.
Estimating is iterative.
Accuracy improves as project progresses.
Conversely, accuracy is quite poor prior to
detailed requirements analysis and sign-off.
19. Watts S. Humphrey, Fellow at Carnegie Mellon
University’s SEI (Software Engineering Institute)
conducted a multiyear study of 13,000 programs:
100~150 errors per 1000 lines of code (LOC)
Each iteration of testing identifies <50%
After 4 iterations of testing ~5 errors per 1000 LOC
Each testing iteration is expensive; companies rarely run all four.
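Humphrey's numbers can be sanity-checked with a back-of-the-envelope sketch. This assumes a flat 50% removal rate per test pass, the optimistic edge of the "<50%" figure above, so treat the result as a lower bound:

```python
# Defect residue after repeated test-and-fix passes (flat removal
# rate is an assumption; the study reports each pass finds <50%).

def defects_remaining(initial_per_kloc, removal_rate, passes):
    """Defects left per 1000 LOC after the given number of passes."""
    remaining = initial_per_kloc
    for _ in range(passes):
        remaining *= (1.0 - removal_rate)
    return remaining

# Starting at ~100 defects/KLOC, four passes at 50% removal each:
print(defects_remaining(100, 0.5, 4))  # 6.25 per KLOC
```

With real-world removal rates below 50% per pass, the residue after four passes is correspondingly higher, which is why "~5 per 1000 LOC" is about the best four iterations can buy.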
20. Lack of education and training. Many people don't
know how to estimate, have no training in estimation,
and receive no feedback on their estimates to help
them improve. A developer who knows how to write
code well doesn't necessarily know how to estimate.
Confusion of the desired schedule/effort target with the
estimate. Development teams are frequently pushed into
dates because of business needs rather than a rational plan
to deliver on those dates.
Hope-based planning. Developers know the "right answer":
that the project is on time and on budget, based on the
marketing- or management-assigned target.
21. Credibility. Software personnel's inability to credibly communicate
and support their estimates. The lack of good estimation processes
and knowledge frequently leads to being pushed into the "shortest
schedule you can't prove you won't make" instead of a rational
schedule based on probable outcomes.
Scope creep. Incomplete, changing, and creeping requirements.
Nothing harms a fixed-price and fixed-schedule project more
than changing and growing requirements.
Quality surprises (“Haste makes waste”). Projects can easily spend half
of their time in the test-and-fix phase, especially when the need for
speed causes the development team to take additional risks and turn
over inadequately tested code.
22. Question:
How is the size of a project
related to the effort to
complete the project?
23. Intuition vs. reality:
Intuition (linear): Project effort is proportional to project size.
This is what most people intuitively believe.
Economies of scale (logarithmic): Project effort diminishes as
project size grows. This follows the law of economies of scale and
is the thinking in a manufacturing environment.
Reality (exponential): Project effort grows exponentially as the
project size increases.
24. Intuitively we expect economies of scale.
This is not true for software projects: as the size of a software
development project becomes larger, the cost per function point
actually rises. Why?
As size becomes larger, complexity increases.
As size becomes larger, a greater number of tasks need to be
completed and coordinated.
As size becomes larger, there are more team members, increasing the
communications and PM challenges.
Since fixed costs for software projects are minimal, there are few
if any economies of scale.
…Complexity kills!
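The diseconomy of scale can be sketched with a COCOMO-style power law. The coefficient and the 1.2 exponent below are illustrative ballpark values, not figures from this deck:

```python
# Illustrative diseconomy-of-scale model: effort = a * size^b with
# b > 1 (COCOMO-style; a=3.0 and b=1.2 are assumed, not measured).

def effort(size_kloc, a=3.0, b=1.2):
    """Nominal effort (person-months) for a project of given size."""
    return a * size_kloc ** b

small = effort(10)    # one 10 KLOC module
big = effort(100)     # one 100 KLOC monolith

print(big / small)                  # ~15.8x the effort for 10x the code
print((big / 100) / (small / 10))   # cost per KLOC rises ~1.58x
```

Because the exponent exceeds 1, cost per KLOC (or per function point) rises with size instead of falling, which is the graph's point.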
25. “Nearly 75% of projects over 8,000 function
points will fail and/or be canceled.”
In IT we have the luxury of being able to break projects up into
modules and stitch them together into a cohesive whole
Better accuracy in estimating
Better probability of successful outcome
Better containment of problems/risks
Better reusability (development) / fallback positions (infrastructure)
Reusability driven development
26. Question:
With respect to a schedule
estimate to complete a task,
what does the probability
distribution look like?
27. Intuition vs. reality:
Single point estimate: Not really an estimate at all. Typically this
is the sign of a target masquerading as an estimate.
Normal distribution: The task is as likely to come in ahead of
schedule as it is to come in behind schedule.
Reality, a long-tail distribution: reflects the reality that
a) the task cannot be completed before a certain date, but
b) there is potential that the task could take much longer than
anticipated.
28. If a task is estimated to take three weeks, the best case
scenario is that it is done instantly; however, the worst case
scenario is not 6 weeks, but much longer.
(Chart: probability of completion vs. duration; the intuitive
single-point guess is a rough 50% estimate.)
Under ideal circumstances it is unlikely that the task can be done
instantly, and even 1 or 2 weeks is highly unlikely, yet there is
reasonable probability that the task will take 5, 6, or more weeks.
The mean probability of the task completion date is to the right
(pessimistic) side of the hump.
Factoring for all pessimistic outcomes is the only way to achieve
accurate estimates.
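The long-tail shape can be illustrated by sampling. A lognormal is one common stand-in for such a curve; the distribution choice and its parameters below are assumptions for illustration, not figures from the deck:

```python
import random

# Sample a long-tailed task-duration curve (lognormal with assumed
# parameters: median ~2.7 weeks against a 3-week estimate).
random.seed(42)

estimate_weeks = 3.0
samples = [random.lognormvariate(1.0, 0.6) for _ in range(100_000)]

late = sum(1 for s in samples if s > estimate_weeks) / len(samples)
very_late = sum(1 for s in samples if s > 2 * estimate_weeks) / len(samples)

print(f"P(longer than the {estimate_weeks}-week estimate): {late:.2f}")
print(f"P(longer than twice the estimate):        {very_late:.2f}")
# No sample is ever near zero weeks, but multi-x overruns occur.
```

The samples never approach zero duration (the hard left bound), while a non-trivial fraction blow past double the estimate, which is exactly the asymmetry the slide describes.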
29. This graph illustrates…
1. Estimating is highly
optimistic (vertical axis
variance)
2. Estimating accuracy
improves as CMM level
increases (horizontal)
3. Overestimating is
never a problem.
4. Estimation accuracy
improves as the
project progresses.
30. McConnell provides a simple questionnaire to
demonstrate that we are routinely overconfident
in our estimates.
Provide a range for the following with 90% confidence:
▪ Surface temperature of the sun?
▪ Latitude of Shanghai?
▪ Area of the Asian continent?
Did you provide a wide enough range to score 90%?
In the complete 10 question survey, the average score is 28%;
that is, roughly three quarters of answers fall outside the
stated 90% ranges.
Lesson: “Avoid using artificially narrow ranges”
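A calibration quiz of this kind is trivial to score: you "win" a question only if the true value falls inside your stated range. The helper below is a hypothetical illustration; the answer values are taken from the speaker notes for this slide:

```python
# Score McConnell-style calibration quizzes: hit rate of intervals
# that actually contain the true value. (Helper name and the sample
# ranges below are illustrative.)

def calibration_score(answers):
    """answers: list of (low, high, true_value); returns hit rate."""
    hits = sum(1 for low, high, true in answers if low <= true <= high)
    return hits / len(answers)

quiz = [
    (5_000, 20_000, 10_000),  # sun's surface temp (deg F): hit
    (20, 25, 31),             # latitude of Shanghai: miss (too narrow)
    (5e6, 30e6, 17.1e6),      # area of Asia (sq miles): hit
]
print(calibration_score(quiz))  # ~0.67, well short of the 90% target
```

Scoring your own answers this way makes the overconfidence concrete: wide, honest ranges feel uncomfortable, yet they are what 90% confidence actually requires.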
31. The penalty for underestimating is more severe than the
penalty for overestimating.
Penalties for overestimating:
Parkinson’s Law states that work will expand to fill the
available time.
Penalties for underestimating:
Reduced effectiveness of project plans.
Statistically reduced chance of on-time completion.
Poor technical foundation, leads to worse-than-nominal
results.
Destructive late-project dynamics make the project worse
than nominal.
Aim for accuracy, but err on the side of overestimation.
32. “Most estimation errors occur before you estimate.”
“You have the most information when you’re doing
something, not before you’ve done it.”
An early estimate is a guess. It is often influenced by
political and competitive pressures.
33. Michael Bolton presents this thought experiment…
Imagine that you have a project, and that, for
estimation’s sake, you broke it down into really fine-
grained detail. The entire project decomposes into
100 tasks, such that you figured that each task would
take one hour. That means that your project should
take 100 hours.
Suppose also that you estimated extremely
conservatively, such that half of the tasks (that is, 50)
were accomplished in half an hour, instead of an hour.
Let’s call these Stunning Successes. 35% of the tasks
are on time; we’ll call them Regular Tasks.
34. 15% of the time, you encounter some bad luck...
8 tasks, instead of taking an hour, take two hours. Let’s call those Little Slips.
4 tasks (one in 25) end up taking four hours, instead of the hour you thought
they’d take. There’s a bug in some library that you’re calling; you need access
to a particular server and the IT guys are overextended so they don’t call back
until after lunch. We’ll call them Wasted Mornings.
2 tasks (one in fifty) take a whole day, instead of an hour. Someone has to stay
home to mind a sick kid. Those we’ll call Lost Days.
1 task in a hundred—just one—takes two days instead of just an hour. A library
developed by another team is a couple of days late; a hard drive crash takes
down a system and it turns out there’s a Post-It note jammed in the backup
tape drive; one of the programmers has her wisdom teeth removed (all these
things have happened on projects that I’ve worked on). These don’t have the
devastating impact of a Black Swan; they’re like baby Black Swans, so let’s call
them Black Cygnets.
35. # of Tasks   Outcome            Coefficient   Total hours
50               Stunning success   0.5           25
35               On time            1.0           35
8                A little slip      2.0           16
4                Wasted morning     4.0           16
2                Lost day           8.0           16
1                Black Cygnet       16.0          16
100                                               124
50 tasks were stunning successes relative to 1 baby black
swan; yet, the project still came in 24% behind schedule!
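The table's arithmetic can be replayed in a few lines; this is simply a restatement of the slide's numbers, nothing new:

```python
# Bolton's outcome mix: 100 tasks, each estimated at 1 hour.
# Coefficients are multiples of the original per-task estimate.
outcomes = [
    # (task count, hours per task)
    (50, 0.5),   # stunning successes
    (35, 1.0),   # on time
    (8,  2.0),   # little slips
    (4,  4.0),   # wasted mornings
    (2,  8.0),   # lost days
    (1, 16.0),   # black cygnets
]

total = sum(count * hours for count, hours in outcomes)
print(total)                                    # 124.0 hours vs. 100 estimated
print(f"{total / 100 - 1:.0%} over schedule")   # 24% over
```

Half the tasks beating their estimates cannot buy back what a single two-day surprise costs; the tail dominates the sum.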
37. # of Tasks   Outcome            Coefficient   Total hours
20               Stunning success   0.9           18
65               On time            1.0           65
8                A little slip      2.0           16
4                Wasted morning     4.0           16
2                Lost day           8.0           16
1                Black Cygnet       16.0          16
100                                               147
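The speaker notes mention running this scenario through a Monte Carlo simulation. A minimal sketch is below; the per-task outcome probabilities come from the table above, while the random seed and run count are arbitrary choices:

```python
import random

# Monte Carlo sketch of the second scenario: 20% stunning success,
# 65% on time, 15% slips of various sizes (probabilities from the
# table; seed and run count are arbitrary).
random.seed(1)

OUTCOMES = [(0.20, 0.9), (0.65, 1.0), (0.08, 2.0),
            (0.04, 4.0), (0.02, 8.0), (0.01, 16.0)]

def project_hours():
    """Sample 100 one-hour tasks against the outcome distribution."""
    total = 0.0
    for _ in range(100):
        r, cum = random.random(), 0.0
        for prob, hours in OUTCOMES:
            cum += prob
            if r < cum:
                total += hours
                break
    return total

runs = [project_hours() for _ in range(5_000)]
on_time = sum(1 for t in runs if t <= 100) / len(runs)
print(f"mean: {sum(runs) / len(runs):.0f}h, on-time rate: {on_time:.1%}")
```

The mean lands around the table's 147 hours, and virtually no run finishes within the 100-hour estimate, matching the notes' observation that essentially every project in this scenario is late.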
38. Tendencies                            Do this instead
Optimistic estimating                     Pragmatic / pessimistic estimating
Under estimate                            Over estimate
Fit estimate to business constraints      Ignore constraints while estimating
One time estimate                         Refine estimate with specifications
39. Modular: break the project into discrete components. Each component is
independent and self-maintained. The components become black-box inputs to
the master project; however, no single module exceeds ‘n’ function points /
‘n’%. No single module failure dooms the entire project. (Use strategies to
prevent/correct problem modules. E.g. use ‘A’ team.)
Iterative (n cycle): Microsoft Word has been improved over many iterations
(versions). Do the same. Plan a roadmap so that each version introduces no
more than ‘n’ function points.
Iterative (1 cycle): Some projects cannot be delivered over many subsequent
versions (e.g. The Big Dig). In such a case use traditional/Agile iterative
development. Plan a formal/complete release cycle monthly. Many (potential)
problems will be discovered earlier.
Simplify: This should be obvious but never is. Pride/ego/expectations/
committees/whathaveyou encourages us to build more than necessary. (Note
that this includes much more than gold-plating.) “Perfection [in design] is
achieved, not when there is nothing more to add, but when there is nothing left
to take away.”3 (Antoine de Saint-Exupéry). Review specifications with a critical
eye, and provide incentives for eliminating features.
40. Add resources: Add more team members, but recognize
this follows the law of diminishing returns. Warning: There
are cases, especially late in the project where this may yield
no positive impact and may even be detrimental.
Improve caliber of resources: replace inexperienced
team members with experienced team members.
Wideband Delphi: (also good at exposing risks). With WBD,
experts anonymously complete a form with their estimates.
They then reveal and discuss their gross differences.
Iteratively repeat until convergence.
Planning Poker: Similar to WBD. Aimed at Agile teams and
coders as opposed to expert estimators. Geometric growth
in card values counters WAGs for larger estimates.
A study found that estimates obtained through the Planning Poker process
were less optimistic and more accurate.
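The geometric growth of Planning Poker card values can be shown concretely. The deck below is the common commercial "modified Fibonacci" deck (decks vary between teams), and the snap-to-card helper is an illustration of why big items can't carry false precision:

```python
# A common Planning Poker deck: values grow roughly geometrically,
# so the gap between neighbouring cards widens as estimates grow.
# (This deck and the helper are illustrative; teams use variants.)

CARDS = [0, 0.5, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def nearest_card(raw_estimate):
    """Snap a raw estimate to the closest card in the deck."""
    return min(CARDS, key=lambda c: abs(c - raw_estimate))

print(nearest_card(6))    # 5
print(nearest_card(27))   # 20
print(nearest_card(70))   # 40: nothing exists between 40 and 100
```

The coarseness at the top of the deck is deliberate: a 70-point "estimate" is a WAG, and the deck forces the team to say so rather than pretend to single-point precision.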
41. Obtain 2~3 estimates using different approaches
Require all estimates to be justified. Gut
feeling is not an adequate justification.
Don't use methods or tools blindly. Try
estimating previous (completed) projects to
validate and tune the methods.
Educate your estimators. Knowing how to do something doesn't
mean you know how long it will take. Train people in estimation.
Accuracy is correlated with training and the ability to see results,
not development experience.
Perfection is the enemy of good enough. Make sure the range is
appropriately wide. (Narrowing the range yields a more precise
estimate but is more costly to produce.)
42. Involve the right (read: experienced) people.
Split the project into self-contained,
manageable sub-projects (<$100K).
Strive for reusability at all levels.
Know the objectives, feel the pain, involve
a senior management champion.
Always communicate estimates as a range rather than a single
value – to convey uncertainty
Eliminate complexity: reduce (features), reuse (team members),
recycle (code)
43. Estimation Confidence Worksheet No So/so Yes
Team: experience working with…
• each other
• the processes and tools
• this environment/culture
• similar projects
Objective: the project/objective…
• is clear and unambiguous
• has management champion and support
• budget and schedule are open to influence from the estimate
• is conventional; minimal creativity and invention required
Complexity: the project…
• can be completed by one team in one major module/iteration
• impacts one department; not multiple departments
• is not a result of or causes a change to business strategy/operations
• estimation is not influenced by a competitive bidding process
Total: (count the checks in the columns above)
Adjusted total: (multiply No x4, So/so x2, Yes x1) x4 x2 x1
Factor (%): (multiply the adjusted total by 10) x10
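The worksheet's scoring rule is mechanical enough to express in code. The weights and the x10 step come from the slide; the sample counts are the worked example from the speaker notes:

```python
# The Estimation Confidence Worksheet's scoring rule: count checks,
# weight them (No x4, So/so x2, Yes x1), multiply by 10 to get a
# percentage "fudge" factor for an upper-bound estimate.

def confidence_factor(no, so_so, yes):
    """Upper-bound multiplier, as a percentage of the raw estimate."""
    adjusted = no * 4 + so_so * 2 + yes * 1
    return adjusted * 10  # percent

factor = confidence_factor(no=3, so_so=5, yes=4)
print(f"{factor}%")            # 260%
print(1000 * factor / 100)     # a 1000h raw estimate -> 2600.0h upper bound
```

With 12 worksheet rows, the factor ranges from 120% (all Yes) to 480% (all No), so even a best-case project gets a modest buffer while a risky one gets nearly 5x.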
People-Related:
1. Undermined motivation
2. Weak personnel
3. Uncontrolled problem employees
4. Heroics
5. Adding people to a late project
6. Noisy, crowded offices
7. Friction between developers and customers
8. Unrealistic expectations
9. Lack of effective project sponsorship
10. Lack of stakeholder buy-in
11. Lack of user input
12. Politics placed over substance
13. Wishful thinking
Process-Related:
14. Overly optimistic schedules
15. Insufficient risk management
16. Contractor failure
17. Insufficient planning
18. Abandonment of planning under pressure
19. Wasted time during the fuzzy front end
20. Shortchanged upstream activities
21. Inadequate design
22. Shortchanged quality assurance
23. Insufficient management controls
24. Premature or too frequent convergence
25. Omitting necessary tasks from estimates
26. Planning to catch up later
27. Code-like-hell programming
Product-Related:
28. Requirements gold-plating
29. Feature creep
30. Developer gold-plating
31. Push me, pull me negotiation
32. Research-oriented development
Technology-Related:
33. Silver-bullet syndrome
34. Overestimated savings from new tools or methods
35. Switching tools in the middle of a project
36. Lack of automated source-code control
45. McConnell adds two new classic mistakes for 2008:
Confusing estimates with targets (process)
Excessive multi-tasking (people)
Although not explicitly identified, sub-optimal
communications is in my experience a classic mistake.
Additional common project threats:
Overly ambitious project objectives. (K.I.S.S.)
Release when done. (Agile: release early, release often)
#2 “Weak personnel” can be exploded into many subtopics:
▪ “weak” can mean technically weak, inexperienced, lack of business
savvy, poor communicator, etc.
▪ The benefits of a few excellent team members outweigh many
moderate team members.
Read: http://www.stevemcconnell.com/rdenum.htm for descriptions of the 36 classic mistakes.
Read Jeff Atwood’s witty take on the classic 36, plus hundreds of blog responses, on his Coding Horror blog.
46. Communications,
the silent killer.
47. I relied on a number of resources while researching this presentation. Most significant is Steve McConnell’s
excellent book, Software Estimation – if you are/will be working on a medium to large project (IT or otherwise)
I highly recommend it. I particularly found the following resources quite useful:
General:
http://www.stevemcconnell.com
http://www.softwaremetrics.com
http://www.objectwatch.com
Specific:
Software’s Top 10 classic mistakes for 2008. (8 of the top 10 from
1996 remain in the top 10 for 2008!)
The Risks Digest. (An enormous catalog of hard, often macabre
lessons (mostly) related to IT project failures.)
Project Wipe-out: Big Failures. (The name says it all. Also contains
a number of useful links on learning from failures.)
Other great resources:
http://www.softwaremetrics.com/Articles/estimating.htm
http://flylib.com/books/en/3.314.1.73/1/
48. 1. Longstreet, D. (2008). Estimating Software Projects. Retrieved January 3, 2009 from Software Metrics:
http://www.softwaremetrics.com/Articles/estimatingdata.htm
2. Sessions, R. (January 2008). The Top Seven Mistakes CIOs Will Make in 2008. Retrieved January 3, 2009
from Object Watch: http://www.objectwatch.com/newsletters/ObjectWatchNewsletter-056.pdf
3. Blog entry by ‘Adam’ (May 8, 2007, 4:44am). Your favorite Programming Quote. Retrieved January 3, 2009
from Coding Horror: http://www.codinghorror.com/blog/archives/000855.html
4. McConnell, S. (2006). Software Estimation. Redmond, Washington: Microsoft Press.
5. Neemuchwala, Abid A., “Evolving IT from ‘Running the Business’ to ‘Changing the Business’”, Tata
Consultancy Services, North America, Aug. 2007.
6. Mann, Charles C., “Why Software is so Bad”, Technology Review (www.technologyreview.com),
July/August 2002.
Initial Presentation: January 2009
This presentation is a foray into the world of project estimating. It can be used to help educate management, clients, and team members. Many excellent additional resources are referenced. (In particular I find Steve McConnell’s book, Software Estimation, indispensable.) I hesitate to say this presentation is specific to software projects; the challenges are the same across all genres. Also, most software projects (at least those that I am involved in) are business solutions, and should be considered as such!
Original: Jan 2009
The Dilbert cartoon, a requisite at the start of any self-respecting IT presentation.
These numbers vary widely. As of Feb 15, 2010, Wikipedia (http://en.wikipedia.org/wiki/Cost_overrun): 9 out of 10 infrastructure/building/IT projects have overruns; 50~100% is a common overrun amount (20 nations, 5 continents). Standish Group 2004 study of IT projects: 43% avg overrun; 71% of projects fail to meet budget, time, *AND* scope; waste estimated at US$55B/yr in the US alone.
Reference: Neemuchwala, Abid A., p3. August 2007 study by Dynamic Markets Limited.
Reference: Neemuchwala, Abid A., p5. Info-Tech Research Group.
2000 Standish Group Study. Reference: Mann, Charles C., p36.
Graph: http://www.swqual.com/verification_validation.html
Successful means on-time, on-budget, and with all features and functions as defined in the initial scope; challenged means late, over budget, and/or with fewer features and functions than defined in the initial scope; failed means cancelled prior to completion, or delivered but never used. (http://blog.casagrande.la/frederic/chaos/)
The following examples illustrate the urgent requirement for better project planning, management, and reporting tools and techniques. Cost overruns are relatively easy to measure on completed projects. Significant schedule overruns often result in cancelled projects; as such we do not see as many really nasty overruns.
Canadian Gun Registry: Original estimate $2M, scrapped at $1B+.
http://www.cbc.ca/news/background/guncontrol/
http://www.maxwideman.com/papers/boondoggle/intro.htm
Boston’s Big Dig: Original estimate $2.8B, Final estimate $22B. http://en.wikipedia.org/wiki/Big_Dig_(Boston,_Massachusetts)
Britain + France’s Chunnel: Original estimate US$9B. Final budget US$17B. http://www.it-cortex.com/Examples_f.htm
http://en.wikipedia.org/wiki/Sydney_Opera_House
Iraq war: Original estimate according to Donald Rumsfeld $50~60B. Washington Post estimates the total cost to be more than $3T! http://www.washingtonpost.com/wpdyn/content/article/2008/03/07/AR2008030702846.html Cartoon: http://politicalhumor.about.com/od/politicalcartoons/ig/Cartoons-2008/Iraq-War-Birthday-Wish.htm
All of these examples are of high exposure projects planned by expert teams with plenty of experience. What makes you think you can do any better?
UK’s NHS NPfIT: Original estimate $12B. As of 2006, $24~29B with much left. http://www.baselinemag.com/c/a/Projects-Management/UK-Dept-of-Health-Prescription-for-Disaster/. Original estimate of US$3.3B; as of 2006 as high as US$29B. http://en.wikipedia.org/wiki/National_Programme_for_IT
London’s Taurus project: Original estimate GBP 6M. 11 years late at GBP 400~800M with no end in sight. http://www.it-cortex.com/Examples_f.htm
Irish HSE PPARS: Original estimate $10.7M. Scrapped after 10 years and a price tag of $180M. http://www.computerworld.com/softwaretopics/software/apps/story/0,10801,105468,00.html
Vancouver Winter Olympics: http://thesecretsofvancouver.com/wordpress/vancouver-will-pass-montreal-in-olympic-debt/taxes
Montreal Stadium: http://en.wikipedia.org/wiki/Olympic_Stadium_(Montreal)
Reference: Mann, Charles C., p35.
Error message source: http://www.goodexperience.com/tib/archives/2006/12/epson_smart_pan.html
Source: http://www.computer.org/portal/web/buildyourcareer/fa006
Is estimating better in the engineering profession than the IT profession? Some would say yes, although many of the previous examples suggest accurate estimating is a problem which plagues engineers as much as any discipline. The reality is estimation problems exist in all projects in all sectors, and are particularly significant when there are novel components. The engineering toolkit in and of itself does not provide for any greater accuracy in estimates; however:
Failed engineering projects tend to have more direct/visible harm [pictures of broken bridges, etc.]. As such, there exist expectations of very rigorous risk management and SDD (Safety-Driven Design), which leads to engineers being highly concerned with detailed specifications. Penalties and litigation are a certainty in failed engineering projects.
Is the relationship linear (project effort is proportional to size), logarithmic (effort grows more slowly than size; economies of scale), or exponential (effort grows faster than size)?
Linear: the standard way of thinking; 1 unit in equates to 1 unit out.
Logarithmic: the manufacturing mindset. The effort to produce the first Ford Model T was high; production of unit n+1 is nominal.
Exponential: this is reality, as evidenced by many studies and explained in following slides. Understanding the implications is critical to large project estimating and ultimate success.
Project effort is a surrogate for project cost and schedule.
Bibliography: Graph and content from http://www.softwaremetrics.com/Articles/estimatingdata.htm.
Plot a prototype curve representing duration on the horizontal axis and probability (of hitting the target date) on the vertical axis.
The vertical line represents the mean (50%, 50/50, nominal) outcome; that is, the 50% probability is to the right of the hump.
Bibliography: Graph and content concepts largely from Software Estimation.
There are limits to how efficient a project can be, but there is no limit to how poorly a project can go.
Example: A contractor says it will take 3 weeks to do a minor bathroom renovation. Under the best case scenario the contractor may save a few days, possibly a week. However, there are nearly unlimited reasons which could conspire to cause the job to take longer, and anybody who has had such a job done knows there is no limit to how long it can take.
Lesson: The rough estimate will be optimistic.
1. Estimating is highly optimistic (vertical axis variance).
2. Estimating accuracy improves as CMM level increases (horizontal).
CMM = Capability Maturity Model: a popular mechanism in the early 2000s to evaluate and encourage improvement (maturity) in an IT organization.
See section 2.1 of Software Estimation for the complete questionnaire.
Answers: ~10,000 °F / ~6,000 °C; 31° North; 17.1M sq miles / 44.4M sq km.
Lesson: Software Estimation, Tip #6.
The penalty for over-estimating is that the cost of the project will be higher than it could have been. (The project grows to fill the available time.) Note that this is equally true with under-estimating: each of the penalties for underestimating adds cost.
Source: http://flylib.com/books/en/3.314.1.74/1/, http://jgre.org/2010/05/02/project-estimation/
“You have the most information when you’re doing something, not before you’ve done it.”: Jason Fried and David Heinemeier Hansson, “Planning is guessing”, Rework, page 19.
Barry Boehm’s (1989) research shows that the average estimation error in estimates made before the project’s scope and objectives are clarified is +400% to -400%.
Source: http://www.developsense.com/blog/2010/10/project-estimation-and-black-swans/
Using the example distributions from the Black Swan scenario, and running them through a Monte Carlo simulation, we see that even with good estimation accuracy a few relatively minor anomalies can throw off a number of “stunning successes”. For every 1 on-time or early project, 9 are late!
Source: http://www.developsense.com/blog/2010/10/project-estimation-and-black-swans/
A more realistic scenario is only 20% of the tasks estimated are stunning successes and 65% are on-time estimates; the 15% under-estimates remain the same. (This is still an overall optimistic scenario.) Using this scenario, and running it through a Monte Carlo simulation, we discover that 100% of the projects are late!
http://en.wikipedia.org/wiki/Wideband_delphi
http://en.wikipedia.org/wiki/Wideband_delphi
http://en.wikipedia.org/wiki/Planning_poker#cite_note-2: “Molokken-Ostvold, K., Haugen, N.C. (13 April 2007). "Combining Estimates with Planning Poker--An Empirical Study". IEEE”
Split the project into manageable chunks: “How do you eat an elephant?...” Reusability enforces a mental discipline that whatever I am working on must be standalone. This in turn leads to higher quality and better interfaces. Know the objectives, feel the pain, and involve a senior management champion are all classic alignment strategies. With the doers, the customers, and management aligned and sharing the risk, odds of success increase dramatically.
http://www.bain.com/publications/articles/five-keys-to-it-program-success.aspx
The factor, expressed as a percentage (i.e. 120% ~ 480%), provides a “fudge” factor. That is, multiply your estimated effort by this factor to get an approximate upper bound adjusted for common variables and risk factors.
E.g. Let’s say you estimate that building a new widget will take 1000hrs. Then on the worksheet above you get 3 Nos, 5 So-sos, and 4 Yeses. The adjusted totals will respectively be 12, 10, and 4. The factor will be 260%; or, a reasonable adjusted upper-bound estimate for building the widget will be 2600hrs.
Regarding sub-optimal communications: Several of the 36 classic mistakes could be rolled up into sub-optimal communications; however, this problem is significant and insidious enough that it should be at the top of any/all PM watch lists. PMI states that ~90% of a PM’s activity should be around communications. Communications is so important that I use “sub-optimal” as the threat. (By the time we get to “poor” [communications] we are already in deep trouble.)
With respect to #2 “Weak personnel”: In my experience soft skills far outweigh technical skills. Business-oriented solutions, business/process automations, portals, and everything else that falls remotely under the BPM umbrella require business savvy and strong communications above technical depth. In fact, the best developer I ever knew may have been the most detrimental team member I have ever had. (And he had excellent language and presentation skills.)
Top 10 classic mistakes for 2008: http://blogs.construx.com/blogs/stevemcc/archive/2008/05/13/Software_2700_s-Classic-Mistakes_2D002D00_2008.aspx
The Risks Digest: http://catless.ncl.ac.uk/Risks/
Project Wipe-out: http://www.bredemeyer.com/Architect/WhitePapers/SoftwareFailures.htm
Web citation bibliography format: AuthorLast, A. (1999, Dec 31). Name of web page. Retrieved Jan 1, 2009, from Name of web site: http://www.url.com