[StepTalks2011] Team Software Process (TSP): High Performance Individuals, High Performance Teams - Alan Willett
- 3. Team Software Process:
High Performance Individuals
High Performance Teams
Alan Willett
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
E-mail: awillett@sei.cmu.edu
Phone: 001 607-592-7279
March 2011
© 2011 Carnegie Mellon University
- 4. Frustrated Executive
Executive needs
• New feature set to meet increasing demands of customers
• On-time delivery
• High-quality software deliverables to lower the cost of maintenance
Executive wish list
• Improved architecture to enable more product lines, new markets, and more revenue
• Higher productivity
- 5. Software Industry Project Performance
The Standish Group's 2009 CHAOS report (project performance, 2000-2008): many software projects fail to meet key technical and business objectives. Only 30% of projects satisfy their planned cost, schedule, and feature commitments.
Nearly 50% are challenged:
• 43% average cost overrun
• 83% average schedule overrun
• Only 52% of planned features are completed
[Chart: share of projects rated Successful, Challenged, and Failed, 2000-2008; x-axis 0-50%]
- 6. Leading Causes of Project Failure
Leading causes of project failure are:
• Unrealistic schedules, staffing, and plans
• Changing or misunderstood requirements
• Inadequate or poorly implemented architecture
• Test-in quality
• Undisciplined practice
- 7. Why?
Management Expectations
• "I want to see the most aggressive schedule possible!"
• "Hurry up and get it to test so we can start finding the defects."
• "Never slip a date until the day you miss it."

Developer Skill Set
Results from over 3,000 developers writing a controlled sample set of programs:
• On average, 40 defects/KLOC removed in unit test (defect density computed as in the sketch below)
• An expected 10-20 defects/KLOC delivered to system test
• Estimation error
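Defects/KLOC is the defect count normalized by thousands of new and changed lines of code. A minimal sketch of the calculation, with hypothetical counts chosen to reproduce the 40 defects/KLOC unit-test average cited above:

def defects_per_kloc(defects_found, new_and_changed_loc):
    """Defect density: defects removed per thousand lines of code."""
    return defects_found / (new_and_changed_loc / 1000.0)

# Hypothetical component: 100 defects found in unit test across
# 2,500 new and changed LOC -> the 40 defects/KLOC average above.
print(defects_per_kloc(100, 2500))  # 40.0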
- 8. SEI Engineering Solutions
Integrates and Leverages SEI Technologies
• CMMI
• SCAMPI toolkit
• Team Software Process
• Rapid Deployment Strategy
• Architecture-Centric Engineering
• Six Sigma
- 9. Case Study
Background:
• Bolsa Mexicana de Valores (BMV) operates the Mexican financial markets under license from the federal government.
• Bursatec is the technology arm of the BMV.
• BMV desired a new trading engine to replace the existing stock market engine and integrate the options and futures markets.
• The BMV performed a build vs. buy analysis and decided to replace their three existing trading engines with one in-house developed system.
- 10. The Project
Bursatec committed to deliver a trading engine in 8-10 quarters:
• High performance (as fast as or faster than anything out there)
• Reliable and of high quality (the market cannot go down)
• Scalable (able to handle both spikes and long-term growth in trading volume)

Complicating factors:
• Pressure – managers are replaced when commitments are not met
• Inexperience – available staff talented but young
• Large project – beyond the organization's recent experience
• Key implementation technologies never used together formally
• Constant stream of new requirements and changes to business rules
- 11. TSP+ACE: Summary of Operational Best Practices
Team Software Process
TSP is a disciplined, agile process for software development teams. TSP builds teams that build high-quality products at lower cost while meeting planned costs and schedules.
• Self-managed team management model
• TSP metrics framework
• Team Launch process
• Estimating method (PROBE; see the sketch below)
• Quality management model
• Personal Software Process
• Project-based deployment strategy

Architecture-Centric Engineering
The discipline of using architecture as the focal point for performing ongoing analyses to assure systems will support their missions.
• Quality Attribute Workshop
• Business Thread Workshop
• Attribute-Driven Design
• Views and Beyond
• Architecture Trade-off Analysis Method
• Active Reviews for Intermediate Design
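PROBE (PROxy-Based Estimating) projects a new job's actual size from the developer's own history of estimates versus actuals, using ordinary least-squares regression. A minimal sketch of that regression step, assuming hypothetical (estimated proxy size, actual size) pairs; the function name and sample data are illustrative, not from the talk:

def probe_parameters(history):
    """Least-squares beta0, beta1 from (estimated, actual) size pairs."""
    n = len(history)
    xs = [e for e, _ in history]
    ys = [a for _, a in history]
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    beta1 = ((sum(x * y for x, y in history) - n * x_bar * y_bar)
             / (sum(x * x for x in xs) - n * x_bar * x_bar))
    beta0 = y_bar - beta1 * x_bar
    return beta0, beta1

# Hypothetical personal history: (estimated proxy LOC, actual LOC)
history = [(120, 160), (200, 230), (90, 150), (310, 360)]
beta0, beta1 = probe_parameters(history)
print(f"Projected actual size: {beta0 + beta1 * 250:.0f} LOC")  # 293 LOC

In the full method, the same regression is applied to effort, and prediction intervals computed from the same history bound the estimate.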
- 12. Personal Software Process (PSP) Training
[Chart: average defects/KLOC removed in test versus program number (1-10) for the PSP training exercises, broken out by language (C, C++, C#, Java, VB); y-axis 0-70 defects/KLOC]
- 13. The TSP Launch Process
1. Establish Product and Business Goals
2. Assign Roles and Define Team Goals
3. Produce Development Strategy
4. Build Top-down and Next-Phase Plans
5. Develop the Quality Plan
6. Build Bottom-up and Consolidated Plans
7. Conduct Risk Assessment
8. Prepare Management Briefing and Launch Report
9. Hold Management Review
Launch Postmortem

The TSP launch process produces the necessary planning artifacts: goals, roles, estimates, task plan, milestones, quality plan, risk mitigation plan, etc. The most important outcome is a committed team.
- 14. Transparency
Visible artifacts from every team, every six weeks. Ongoing reviews of all artifacts ensure technical quality.

Project status:
• To-date effort and schedule
• Predicted effort and schedule
• Resource utilization
• Process and plan fidelity
• Pre-test, post-test, and release quality
• Cycle and phase entry and exit criteria

[Charts from a 2002-2003 project, week by week: cumulative earned value (planned, actual, and predicted); cumulative planned and actual hours per week; percent defect free; defects removed and defect density by phase for assembly SYSTEM; cumulative defects removed by phase. See the earned-value sketch below.]
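In the TSP metrics framework, each task's planned value is its share of total planned hours, earned only when the task is 100% complete; cumulative earned value against planned value drives the status charts above. A minimal sketch, assuming planned value proportional to planned hours and hypothetical task data:

tasks = [
    {"name": "design review", "planned_hours": 12, "done": True},
    {"name": "code module A", "planned_hours": 30, "done": True},
    {"name": "unit test A",   "planned_hours": 18, "done": False},
]

total_planned = sum(t["planned_hours"] for t in tasks)
earned = sum(t["planned_hours"] for t in tasks if t["done"])
print(f"Cumulative earned value: {100 * earned / total_planned:.1f}%")  # 70.0%

Because no partial credit is given, earned value only moves when tasks actually finish, which keeps the weekly status honest.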
- 15. Experience and Results
The ACE methods and architecture coaching, coupled with the discipline of the TSP, built a competent architecture team quickly.
The project objectives were met.
• Schedule – finished early
• Quality – early trials and quality metrics suggest that reliability and quality goals were met
• Performance – a day's worth of transactions can be processed in seconds
• Total test effort was less than 15% of total development effort
• Architecture development effort was less than 15% of total development effort
- 16. AIM and Case Study #2
The SEI developed AIM (the Accelerated Improvement Method) to meet the goals and challenges of:
• Rapid achievement of maturity levels
• High-performance projects
• Rapid return on investment
The Accelerated Improvement Method combines:
• CMMI
• SCAMPI
• Team Software Process
• Rapid Deployment Strategy