1
CMMI Model Changes
for High Maturity
Herb Weiner
Pat O’Toole
2008 SEPG Conference
Tampa, Florida
2
Problem Statement
High maturity practices are not consistently
understood, applied, or appraised
 SEI is addressing the training and appraisal portions of
the CMMI Product Suite; e.g.,
 Understanding CMMI High Maturity Practices course
 Several recent presentations by SEI personnel
 High Maturity Lead Appraisers certification
 However, there is insufficient foundation for these
“raise-the-floor” interpretations in CMMI v1.2
 Goals do not establish the requirements
 Practices do not establish the expectations
 Informative material is purported to take on greater importance.
3
Eating Your Own Dog Food
Requirements Management SG1:
 Requirements are managed and
inconsistencies with project plans and work
products are identified
CMMI Product Suite Management SG1:
 CMMI model requirements are managed and
inconsistencies with CMMI training courses
and appraisal methods are identified.
4
Approach
 Draft proposed changes
 CMMI Model & SCAMPI Method Changes for High Maturity
(Herb Weiner, May 2007)
 Solicit feedback from SEI authorized people via ATLAS
 ATLAS = Ask The Lead AppraiserS
 ATLAS has been expanded to include CMMI instructors
 Candidate lead appraisers and instructors also included
 Publish results to SEI authorized individuals
 Submit CRs to SEI for consideration
 Update model to re-align the CMMI Product Suite.
5
ATLAS Feedback
For each proposed change, respondents indicated:
 Strongly support (It’s perfect!)
 Support (It’s better)
 Are ambivalent (It’s OK either way)
 Disagree (It’s worse)
 Strongly disagree (What were you thinking?)
Ratings were determined on a +1 to -1 scale as follows:
 Strongly support = +1.0
 Support = +0.5
 Ambivalent = 0.0
 Disagree = -0.5
 Strongly disagree = -1.0
For each change, the average rating will be displayed for:
 [High Maturity Lead Appraisers, Other SEI authorized individuals]
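The rating scheme above amounts to a simple mapping-and-averaging computation. A minimal sketch (the response labels follow the slide; the sample responses are invented, not actual ATLAS data):

```python
# Map each ATLAS response to the +1.0 .. -1.0 scale from the slide,
# then average the mapped scores for one proposed change.
RATING_SCALE = {
    "strongly support": 1.0,
    "support": 0.5,
    "ambivalent": 0.0,
    "disagree": -0.5,
    "strongly disagree": -1.0,
}

def average_rating(responses):
    """Average the mapped ratings for one proposed change."""
    scores = [RATING_SCALE[r.lower()] for r in responses]
    return sum(scores) / len(scores)

# Hypothetical respondent sample: mostly supportive feedback.
sample = ["Strongly support", "Support", "Support", "Ambivalent"]
print(round(average_rating(sample), 2))  # 0.5
```

Each change on the following slides reports two such averages: one for High Maturity Lead Appraisers, one for other SEI authorized individuals.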
6
Proposed
OPP
Changes
7
OPP Proposed Change #1 of 4
Move SP 1.3 to SP 1.1
Current:
SP 1.1 Select Processes
SP 1.2 Establish Process-Performance Measures
SP 1.3 Establish Quality and Process-Performance
Objectives
Proposed:
SP 1.1 Establish Quality and Process-Performance
Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process-Performance Measures
 MA, OPF, and QPM establish objectives in SP 1.1.
(.50, .51)
8
OPP Proposed Change #2 of 4
Revise OPP SP 1.4
Current:
Establish and maintain the organization’s process-
performance baselines.
Proposed:
Conduct process-performance analyses on the selected
processes and subprocesses to verify process stability
and to establish and maintain the organization’s
process-performance baselines.
 SP 1.1 & 1.2 indicate process-performance analysis will
be conducted, but that’s the last we hear of it
 Baselines are established for stable processes
 Elevate this from informative to expected.
(.39, .42)
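In statistical terms, "verify process stability and establish a process-performance baseline" is often realized as a centre line with control limits computed from historical subprocess data. A minimal sketch, assuming the conventional 3-sigma limits; the measure name and data are invented, not from the model:

```python
# Compute a simple process-performance baseline: mean plus 3-sigma
# upper/lower control limits from historical observations.
from statistics import mean, stdev

def baseline(samples):
    """Return centre line and 3-sigma control limits for one measure."""
    m = mean(samples)
    s = stdev(samples)
    return {"mean": m, "ucl": m + 3 * s, "lcl": m - 3 * s}

# Invented peer-review defect densities (defects/KLOC) from past projects
history = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.3, 3.7]
b = baseline(history)
print(round(b["mean"], 4))  # 4.0625
```

A real baseline would follow the organization's own measurement definitions and stability criteria; this only illustrates what the proposed practice statement asks process-performance analysis to produce.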
9
OPP Proposed Change #3 of 4
Revise OPP SP 1.5
Current:
Establish and maintain the process-performance models for the
organization’s set of standard processes.
Proposed:
Establish and maintain models that predict process performance
related to the quality and process-performance objectives.
 The SEI’s new training courses emphasize use of process-
performance models with respect to quantitative objectives
 Focusing this practice on these objectives achieves better alignment
between the model and training.
(.59, .50)
10
OPP Proposed Change #4 of 4
Enhance the informative material
Proposed:
Modify informative material that suggests improving process
performance, such as the examples found in OPP SP 1.3 (which
imply that common causes of variation should be addressed)
Add new informative material indicating that, at ML4/CL4,
achieving such improvement might be addressed via OPF and
GP3.1, while at ML5/CL5, it is more likely to be achieved through
CAR, OID, and GP5.2
 In order to delineate level 4 from level 5, the model should avoid
implying that common causes of variation are addressed at level 4
 ML4/CL4: Process stability / execution consistency / special causes
 ML5/CL5: Improving capability / systemic improvement / common causes.
(.36, .44)
11
Proposed
QPM
Changes
12
QPM Proposed Change #1 of 4
Revise QPM SP 1.4
Current:
SP 1.4 Manage Project Performance
Monitor the project to determine whether the project’s objectives for quality
and process performance will be satisfied, and identify corrective action as
appropriate.
Proposed:
SP 1.4 Analyze Project Performance
Analyze the collective performance of the project's subprocesses to
predict whether the project's objectives for quality and process
performance will be satisfied and identify the need for corrective action as
appropriate.
 Fixes mismatch between the current title and practice statement
 Recognizes that project management deals with both quantitatively
managed, and non-quantitatively managed processes.
(.54, .57)
13
QPM Proposed Change #2 of 4
Add QPM SP 1.5
Current: <None>
Proposed:
SP 1.5 Use Process-Performance Models
Use calibrated process-performance models
throughout the life cycle to identify, analyze, and
execute corrective action when necessary.
 Currently, PPMs aren’t expected to be used in QPM
 But use throughout life cycle appears to be expected by SEI
 PPMs may support process or subprocess activities
 Added practice to SG 1, but it could have been added to SG2.
(.39, .46)
14
QPM Proposed Change #3 of 4
Add QPM SP 2.3
Current: <None>
Proposed:
SP 2.3 Address Special Causes of Variation
Identify, address, and prevent reoccurrence of special causes of
variation in the selected subprocesses.
 “Special causes” are featured in SEI materials
 Currently “special causes” are only in QPM’s informative material
 The Glossary definition of “stable process” includes “…and prevent
reoccurrences of special causes”
 Add informative material to ensure that process performance data
and statistical techniques are used appropriately.
(.64, .48)
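The simplest operational meaning of "identify special causes of variation" is the basic control-chart rule: flag observations outside the subprocess's control limits. A minimal sketch; the limits and data are invented for illustration:

```python
# Flag observations outside the control limits -- the most basic of the
# standard control-chart detection rules for special causes of variation.
def special_causes(observations, lcl, ucl):
    """Return indices of observations that signal a special cause."""
    return [i for i, x in enumerate(observations) if x < lcl or x > ucl]

obs = [4.0, 4.2, 6.1, 3.9, 4.1]   # 6.1 lies above the upper limit
print(special_causes(obs, lcl=2.9, ucl=5.3))  # [2]
```

Full SPC practice adds further rules (runs, trends), which is part of what the new informative material on appropriate statistical techniques would cover.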
15
QPM Proposed Change #4 of 4
Revise QPM SP 2.3 (now SP 2.4)
Current:
SP 2.3 Monitor Performance of the Selected Subprocesses
Monitor the performance of the selected subprocesses to
determine their capability to satisfy their quality and process-
performance objectives, and identify corrective action as
necessary.
Proposed:
SP 2.4 Analyze Performance of the Selected Subprocesses
Analyze the performance of the selected subprocesses to predict
their capability to satisfy their quality and process-performance
objectives, and identify and take corrective action as necessary.
 “Analyze” is a much stronger word than “monitor”
 “Predict” is a much stronger word than “determine”
 Emphasize “taking corrective action,” not just identifying it.
(.59, .46)
16
Proposed
CAR
Changes
17
CAR Proposed Change #1 of 7
Thematic Change
 Currently, there is little to suggest that CAR should target
statistically managed subprocesses to identify and analyze
common causes of variation to address:
 Stable processes with unacceptably high standard deviations;
 Stable processes not capable of achieving quality or process
performance objectives; and
 Stable and capable processes that might be improved to enhance
competitive advantage
 Change the focus of CAR’s specific goals and practices from
“defects and other problems” to “problems”
 By collapsing this phrase, model users will not limit their application of
CAR to the subset of problem candidates called “defects”
 Also include a discussion of “opportunities” in the informative material.
(.50, .46)
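The stable-but-not-capable case in the first bullet group can be quantified with a capability index such as Cpk: how far the process mean sits from the nearest specification limit, in 3-sigma half-widths. A sketch with invented figures (the model itself does not prescribe Cpk):

```python
# Cpk: distance from the process mean to the nearest spec limit,
# divided by 3 sigma. Below 1.0, a stable process still cannot
# reliably meet its quality and process-performance objective.
def cpk(process_mean, sigma, lsl, usl):
    return min(usl - process_mean, process_mean - lsl) / (3 * sigma)

# Stable (small sigma) yet not capable: the mean is too close to a limit.
print(round(cpk(process_mean=9.0, sigma=0.5, lsl=6.0, usl=10.0), 2))  # 0.67
```

Such a process is a natural CAR candidate under the proposed thematic change even though no "defect" has occurred.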
18
CAR Proposed Change #2 of 7
Revise CAR SG 1
Current:
SG 1 Determine Causes of Defects
Root causes of defects and other problems are
systematically determined.
Proposed:
SG 1 Determine and Analyze Causes
Common causes of variation and root causes of
problems are systematically analyzed.
 Reflects the Thematic Change
 “Analyzed” is a stronger word than “determined”.
(.56, .63)
19
CAR Proposed Change #3 of 7
Revise CAR SP 1.1
Current:
SP 1.1 Select Defect Data for Analysis
Select the defects and other problems for analysis.
Proposed:
SP 1.1 Select Data for Analysis
Select for analysis, using established criteria, quantitatively managed
processes that are candidates for improvement as well as problems
that have a significant effect on quality and process performance.
 Reflects the Thematic Change
 “Significant effect” emphasizes quantitatively managed processes.
(.64, .53)
20
CAR Proposed Change #4 of 7
Revise CAR SP 1.2 and add SP1.3-SP 1.4
Current:
SP 1.2 Analyze Causes
Perform causal analysis of selected defects and other problems and propose
actions to address them.
Proposed:
SP 1.2 Analyze Common Causes
Analyze common causes of variation to understand the inherent quality and
process performance constraints.
SP 1.3 Analyze Root Causes
Perform causal analysis on selected problems to determine their root
causes.
SP 1.4 Propose Actions to Address Causes
Propose actions to address selected common causes of variation and to
prevent recurrence of selected problems.
 Reflects the Thematic Change.
 Establishes expectations for BOTH common causes and root causes.
(.44, .57)
21
CAR Proposed Change #5 of 7
Add CAR SP 1.5
Current: <None>
Proposed:
SP 1.5 Predict Effects of Proposed Actions
Use process performance models and statistical
techniques to predict, in quantitative terms, the effects
of the proposed actions, as appropriate.
 Reflects the SEI’s expected use of PPMs and statistical
methods in high maturity organizations
 Supports proper cost/benefit analysis.
(.52, .58)
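What "predict, in quantitative terms, the effects of the proposed actions" can look like: feed the proposed change into a calibrated process-performance model and read off the predicted outcome. The toy PPM below is an ordinary least-squares line relating review coverage to escaped defects; the relationship and data are invented, not a real organizational model:

```python
# Toy process-performance model: least-squares line y = a + b*x relating
# review coverage (%) to escaped defects, used to predict the effect of a
# proposed action that raises coverage.
from statistics import mean

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Invented historical data: coverage (%) vs. escaped defects per release
coverage = [50, 60, 70, 80]
escapes = [20, 16, 12, 8]
a, b = fit_line(coverage, escapes)

# Predicted effect of a proposed action raising coverage to 90%
print(a + b * 90)  # 4.0
```

The predicted delta (here, 8 escapes at 80% coverage down to 4 at 90%) is exactly the quantity a cost/benefit analysis of the action proposal needs.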
22
CAR Proposed Change #6 of 7
Revise CAR SG 2, SP 2.1 – SP 2.2
Current:
SG 2 Analyze Causes
Root causes of defects and other problems are systematically addressed to
prevent their future occurrence.
SP 2.1 Implement the Action Proposals
Implement the selected action proposals that were developed in causal analysis.
SP 2.2 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.
Proposed:
SG 2 Address Causes
Common causes of variation and root causes of problems are systematically
addressed to quantitatively improve quality and process performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a measurable
improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process performance.
23
CAR Proposed Change #6 of 7
Proposed: (Copied from previous slide)
SG 2 Address Causes
Common causes of variation and root causes of problems are
systematically addressed to quantitatively improve quality and process
performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a
measurable improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process
performance.
 Reflects the Thematic Change
 Wording enhanced to focus on measurable improvement of “quality and
process performance” – a phrase reserved for high maturity practices
 SP 2.2 modified to include quality as well as process performance
 A perceived oversight in the current practice.
(.46, .64)
24
CAR Proposed Change #7 of 7
Revise CAR SP 2.3
Current:
SP 2.3 Record Data
Record causal analysis and resolution data for use across the
project and organization.
Proposed:
SP 2.3 Submit Improvement Proposals
Submit process- and technology-improvement proposals based
on implemented actions, as appropriate.
 Proposed practice relies on OID to determine “use across the
project and organization”
 Recognizes that CAR may have been applied locally but the resulting
improvements may be more broadly applicable.
(.48, .41)
25
CAR Proposed Change #8 of 7
CAR is the only high maturity process area with
no lower-level foundation
 OPP – OPD & MA
 QPM – PP, PMC & IPM
 OID – OPF & OPD
Several alternatives were explored via ATLAS:
0. Leave CAR exactly as it is (-.08, -.19)
1. Add "Causal Analysis" PA at ML2 (-.45, -.55)
2. Add "Causal Analysis" PA at ML3 (-.45, -.26)
3. Add "Causal Analysis" practice to PMC SG2 (+.09, +.16)
4. Add "Issue & Causal Analysis" PA at ML2 (-.55, -.22)
5. Add "Causal Analysis" goal to OPF (-.45, -.22)
26
Proposed
OID
Changes
27
OID Proposed Change #1 of 7
Revise OID SG 1
Current:
SG 1 Select Improvements
Process and technology improvements, which contribute to
meeting quality and process-performance objectives, are selected.
Proposed:
SG 1 Select Improvements
Process and technology improvements are identified proactively,
evaluated quantitatively, and selected for deployment based on
their contribution to quality and process performance.
 Somewhat passive vs. very proactive
 Focus on quantitative evaluation and ongoing improvement.
(.66, .63)
28
OID Proposed Change #2 of 7
Revise OID SP 1.1
Current:
SP 1.1 Collect and Analyze Improvement Proposals
Collect and analyze process- and technology-improvement
proposals.
Proposed:
SP 1.1 Solicit Improvement Proposals
Solicit proposals for incremental process and technology
improvements.
 “Solicit” is more proactive than “collect”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Explicitly targets incremental improvements.
(.66, .43)
29
OID Proposed Change #3 of 7
Revise OID SP 1.2
Current:
SP 1.2 Identify and Analyze Innovations
Identify and analyze innovative improvements that could increase
the organization’s quality and process performance.
Proposed:
SP 1.2 Seek Innovations
Seek and investigate innovative processes and technologies that
have potential for significantly improving the organization’s quality
and process performance.
 “Seek and investigate” is more proactive than “identify”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Focuses on “significant” performance enhancement.
(.65, .50)
30
OID Proposed Change #4 of 7
Add OID SP 1.3
Current: <None>
Proposed:
SP 1.3 Model Improvements
Use process performance models, as appropriate, to
predict the effect of incremental and innovative
improvements in quantitative terms.
 Adds modeling as an additional “filter”
 Supports quantitative cost/benefit analysis.
(.68, .44)
31
OID Proposed Change #5 of 7
Revise OID SP 1.3 (now SP 1.4)
Current:
SP 1.3 Pilot Improvements
Pilot process and technology improvements to select
which ones to implement.
Proposed:
SP 1.4 Pilot Improvements
Pilot proposed improvements, as appropriate, to
evaluate the actual effect on quality and process
performance in quantitative terms.
 Piloting performed “as appropriate”
 Provides rationale for implementation.
(.70, .61)
32
OID Proposed Change #6 of 7
Revise OID SP 1.4 (now SP 1.5)
Current:
SP 1.4 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization.
Proposed:
SP 1.5 Select Improvements for Deployment
Select process and technology improvements for
deployment across the organization based on an
evaluation of costs, benefits, and other factors.
 Provides cost and benefits as the basis for selection
 “Other factors” provides flexibility.
(.67, .51)
33
OID Proposed Change #7 of 7
Replace OID SP 2.3
Current:
SP 2.3 Measure Improvement Effects
Measure the effects of the deployed process and
technology improvements.
Proposed:
SP 2.3 Measure Improvement Effects
Evaluate the effects of deployed improvements on
quality and process performance in quantitative terms.
 Specifies evaluation criteria
 Indicates “quantitative” evaluation
 New informative material – update baselines/models.
(.70, .63)
34
What’s Next?
35
Change Requests
1. Since the feedback related to the
proposed changes was primarily
supportive, all will be submitted as Change
Requests to the SEI for consideration.
2. A change request has been submitted for the
UCHMP course to add an exercise re-writing
high maturity practices using the ATLAS
results as the base.
36
Now It’s YOUR Turn!
Handout contains ATLAS #12Z proposing:
 Consolidating ML5 PAs into ML4
 Changing ML5 to “Sustaining Excellence”
 Achieve ML4
 ML4 = OPP, QPM, CAR, & OID
 No additional process areas at ML5
 Perform at high maturity for 2 consecutive years
 Demonstrate sustained business benefit as well
 Submit your input to PACT.otoole@att.net
 Results will be published to all submitters.
37
Questions?
???
38
Download & Contact Information
Refer to the following websites to:
 Contact the authors
 Download the final SEPG 2008 presentation
 Download the supporting ATLAS 12A – 12D
results
 Download the CMMI Model and SCAMPI Method
Changes presentation from the May 2007 San
Francisco Beyond CMMI v1.2 Workshop
Herb Weiner
Herb.Weiner@welchallyn.com
www.highmaturity.com
Pat O’Toole
PACT.otoole@att.net
www.pactcmmi.com

Contenu connexe

Tendances

Core tools apqp, ppap, fmea, spc and msa
Core tools   apqp, ppap, fmea, spc and msa Core tools   apqp, ppap, fmea, spc and msa
Core tools apqp, ppap, fmea, spc and msa Mouhcine Nahal
 
Ensuring Project Quality Powerpoint Presentation Slides
Ensuring Project Quality Powerpoint Presentation SlidesEnsuring Project Quality Powerpoint Presentation Slides
Ensuring Project Quality Powerpoint Presentation SlidesSlideTeam
 
Process auditing as per VDA 6.3
Process auditing as per VDA 6.3Process auditing as per VDA 6.3
Process auditing as per VDA 6.3Kiran Walimbe
 
A lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projectsA lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projectsSonata Software
 
Quality management strategy example
Quality management strategy exampleQuality management strategy example
Quality management strategy exampleselinasimpson1201
 
Project Quality Management Plan Checklist PowerPoint Presentation Slides
Project Quality Management Plan Checklist PowerPoint Presentation SlidesProject Quality Management Plan Checklist PowerPoint Presentation Slides
Project Quality Management Plan Checklist PowerPoint Presentation SlidesSlideTeam
 
Integrated methodology for testing and quality management.
Integrated methodology for testing and quality management.Integrated methodology for testing and quality management.
Integrated methodology for testing and quality management.Mindtree Ltd.
 
ISTQB Advanced Study Guide - 8
ISTQB Advanced Study Guide - 8ISTQB Advanced Study Guide - 8
ISTQB Advanced Study Guide - 8Yogindernath Gupta
 

Tendances (20)

Introduction to CMMI-DEV v1.3 - Day 4
Introduction to CMMI-DEV v1.3  - Day 4Introduction to CMMI-DEV v1.3  - Day 4
Introduction to CMMI-DEV v1.3 - Day 4
 
Ch 7(spi)intro tocm-mi2013
Ch 7(spi)intro tocm-mi2013Ch 7(spi)intro tocm-mi2013
Ch 7(spi)intro tocm-mi2013
 
Core tools apqp, ppap, fmea, spc and msa
Core tools   apqp, ppap, fmea, spc and msa Core tools   apqp, ppap, fmea, spc and msa
Core tools apqp, ppap, fmea, spc and msa
 
Ga article
Ga articleGa article
Ga article
 
Project quality management
Project quality managementProject quality management
Project quality management
 
Introduction to CMMI-DEV v1.3 - Day 1
Introduction to CMMI-DEV v1.3  - Day 1Introduction to CMMI-DEV v1.3  - Day 1
Introduction to CMMI-DEV v1.3 - Day 1
 
TMMi Implementation Guideline
TMMi Implementation GuidelineTMMi Implementation Guideline
TMMi Implementation Guideline
 
Quality Assurance in Software Ind.
Quality Assurance in Software Ind.Quality Assurance in Software Ind.
Quality Assurance in Software Ind.
 
The Process Audit
The Process AuditThe Process Audit
The Process Audit
 
Ensuring Project Quality Powerpoint Presentation Slides
Ensuring Project Quality Powerpoint Presentation SlidesEnsuring Project Quality Powerpoint Presentation Slides
Ensuring Project Quality Powerpoint Presentation Slides
 
5. project quality management
5. project quality management5. project quality management
5. project quality management
 
Tqm ch 06
Tqm ch 06Tqm ch 06
Tqm ch 06
 
Process auditing as per VDA 6.3
Process auditing as per VDA 6.3Process auditing as per VDA 6.3
Process auditing as per VDA 6.3
 
Project quality management
Project quality managementProject quality management
Project quality management
 
csc 510 Project
csc 510 Projectcsc 510 Project
csc 510 Project
 
A lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projectsA lean model based outlook on cost & quality optimization in software projects
A lean model based outlook on cost & quality optimization in software projects
 
Quality management strategy example
Quality management strategy exampleQuality management strategy example
Quality management strategy example
 
Project Quality Management Plan Checklist PowerPoint Presentation Slides
Project Quality Management Plan Checklist PowerPoint Presentation SlidesProject Quality Management Plan Checklist PowerPoint Presentation Slides
Project Quality Management Plan Checklist PowerPoint Presentation Slides
 
Integrated methodology for testing and quality management.
Integrated methodology for testing and quality management.Integrated methodology for testing and quality management.
Integrated methodology for testing and quality management.
 
ISTQB Advanced Study Guide - 8
ISTQB Advanced Study Guide - 8ISTQB Advanced Study Guide - 8
ISTQB Advanced Study Guide - 8
 

En vedette

[slides] Gestão de Projetos (2015: 2º semestre)
[slides] Gestão de Projetos (2015: 2º semestre)[slides] Gestão de Projetos (2015: 2º semestre)
[slides] Gestão de Projetos (2015: 2º semestre)Alessandro Almeida
 
[palestra] Definindo a proposta de valor com Design Thinking e Canvas
[palestra] Definindo a proposta de valor com Design Thinking e Canvas[palestra] Definindo a proposta de valor com Design Thinking e Canvas
[palestra] Definindo a proposta de valor com Design Thinking e CanvasAlessandro Almeida
 
Apresentação da ARANATECH Engenharia Elétrica São Carlos
Apresentação da ARANATECH Engenharia Elétrica São CarlosApresentação da ARANATECH Engenharia Elétrica São Carlos
Apresentação da ARANATECH Engenharia Elétrica São CarlosEdgar Arana
 
[slides] CMMI (2011: 1º semestre)
[slides] CMMI (2011: 1º semestre)[slides] CMMI (2011: 1º semestre)
[slides] CMMI (2011: 1º semestre)Alessandro Almeida
 
[metodologia] Definição da Proposta de Valor
[metodologia] Definição da Proposta de Valor[metodologia] Definição da Proposta de Valor
[metodologia] Definição da Proposta de ValorAlessandro Almeida
 
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)Gestão da Mudança Organizacional (2ª edição - 11/10/2017)
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)Alessandro Almeida
 
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)Alessandro Almeida
 
Status Report dos TCCs: SIN-NA7 - 2015_2º semestre
Status Report dos TCCs: SIN-NA7 - 2015_2º semestreStatus Report dos TCCs: SIN-NA7 - 2015_2º semestre
Status Report dos TCCs: SIN-NA7 - 2015_2º semestreAlessandro Almeida
 
Adminstracion de los Recursos Humanos
Adminstracion de los Recursos HumanosAdminstracion de los Recursos Humanos
Adminstracion de los Recursos Humanosjoseph251
 
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)Alessandro Almeida
 
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...Alessandro Almeida
 
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...Alessandro Almeida
 
Scrum - Desenvolvimento Ágil
Scrum - Desenvolvimento ÁgilScrum - Desenvolvimento Ágil
Scrum - Desenvolvimento ÁgilIsrael Santiago
 
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...Alessandro Almeida
 
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)Alessandro Almeida
 
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016Status Report dos TCCs (SIN-NA8): 2º semestre de 2016
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016Alessandro Almeida
 

En vedette (20)

[slides] Gestão de Projetos (2015: 2º semestre)
[slides] Gestão de Projetos (2015: 2º semestre)[slides] Gestão de Projetos (2015: 2º semestre)
[slides] Gestão de Projetos (2015: 2º semestre)
 
[palestra] Definindo a proposta de valor com Design Thinking e Canvas
[palestra] Definindo a proposta de valor com Design Thinking e Canvas[palestra] Definindo a proposta de valor com Design Thinking e Canvas
[palestra] Definindo a proposta de valor com Design Thinking e Canvas
 
Apresentação da ARANATECH Engenharia Elétrica São Carlos
Apresentação da ARANATECH Engenharia Elétrica São CarlosApresentação da ARANATECH Engenharia Elétrica São Carlos
Apresentação da ARANATECH Engenharia Elétrica São Carlos
 
Catalogo
CatalogoCatalogo
Catalogo
 
[slides] CMMI (2011: 1º semestre)
[slides] CMMI (2011: 1º semestre)[slides] CMMI (2011: 1º semestre)
[slides] CMMI (2011: 1º semestre)
 
[metodologia] Definição da Proposta de Valor
[metodologia] Definição da Proposta de Valor[metodologia] Definição da Proposta de Valor
[metodologia] Definição da Proposta de Valor
 
Workshop ietec Devops Testing
Workshop ietec Devops TestingWorkshop ietec Devops Testing
Workshop ietec Devops Testing
 
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)Gestão da Mudança Organizacional (2ª edição - 11/10/2017)
Gestão da Mudança Organizacional (2ª edição - 11/10/2017)
 
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)
[Avaliação da Disciplina] Introdução à Gestão de Projetos (2016: 1º semestre)
 
Status Report dos TCCs: SIN-NA7 - 2015_2º semestre
Status Report dos TCCs: SIN-NA7 - 2015_2º semestreStatus Report dos TCCs: SIN-NA7 - 2015_2º semestre
Status Report dos TCCs: SIN-NA7 - 2015_2º semestre
 
Adminstracion de los Recursos Humanos
Adminstracion de los Recursos HumanosAdminstracion de los Recursos Humanos
Adminstracion de los Recursos Humanos
 
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)
[Modelo de Negócios] TCC: TADS (2016 - 2º semestre)
 
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...
Minicurso - Aplicando o Design Thinking para definir a proposta de valor e o ...
 
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...
Templates: Mapa da Empatia, Canvas da Proposta de Valor, Canvas do Modelo de ...
 
Scrum - Desenvolvimento Ágil
Scrum - Desenvolvimento ÁgilScrum - Desenvolvimento Ágil
Scrum - Desenvolvimento Ágil
 
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...
[Avaliação da Disciplina] Planejamento, Execução e Controle de Projetos (2016...
 
Administração financeira
Administração financeiraAdministração financeira
Administração financeira
 
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)
[Modelo de Negócios] TCC: Sistemas de Informação (2016 - 2º semestre)
 
Etic: Estrategia Bolivia
Etic: Estrategia BoliviaEtic: Estrategia Bolivia
Etic: Estrategia Bolivia
 
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016Status Report dos TCCs (SIN-NA8): 2º semestre de 2016
Status Report dos TCCs (SIN-NA8): 2º semestre de 2016
 

Similaire à CMMI Model Changes for High Maturity Practices

PECB Webinar: Aligning ISO 25000 and CMMI for Development
PECB Webinar: Aligning ISO 25000 and CMMI for DevelopmentPECB Webinar: Aligning ISO 25000 and CMMI for Development
PECB Webinar: Aligning ISO 25000 and CMMI for DevelopmentPECB
 
Assessing Your Processes using ISO Standards
CMMI Model Changes for High Maturity Practices

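The bracketed averages shown for each change can be reproduced with a short script. A minimal sketch in Python; the weights follow the scale above, but the response labels and sample data are invented for illustration:

```python
# Hypothetical sketch of the ATLAS rating roll-up; the weights match the
# +1 to -1 scale on the slide, the sample responses are invented.
WEIGHTS = {
    "strongly support": 1.0,
    "support": 0.5,
    "ambivalent": 0.0,
    "disagree": -0.5,
    "strongly disagree": -1.0,
}

def average_rating(responses):
    """Average the +1..-1 weights over a list of response labels."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

# One average per respondent group, displayed as [HMLAs, other authorized]
hmla = ["strongly support", "support", "support", "ambivalent"]
others = ["support", "support", "disagree", "strongly support"]
print(f"[{average_rating(hmla):.2f}, {average_rating(others):.2f}]")
```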
7
OPP Proposed Change #1 of 4
Move SP 1.3 to SP 1.1
Current:
SP 1.1 Select Processes
SP 1.2 Establish Process-Performance Measures
SP 1.3 Establish Quality and Process-Performance Objectives
Proposed:
SP 1.1 Establish Quality and Process-Performance Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process-Performance Measures
 MA, OPF, and QPM establish objectives in SP 1.1. (.50, .51)
8
OPP Proposed Change #2 of 4
Revise OPP SP 1.4
Current:
Establish and maintain the organization’s process-performance baselines.
Proposed:
Conduct process-performance analyses on the selected processes and subprocesses to verify process stability and to establish and maintain the organization’s process-performance baselines.
 SP 1.1 & 1.2 indicate process-performance analysis will be conducted, but that’s the last we hear of it
 Baselines are established for stable processes
 Elevate this from informative to expected. (.39, .42)
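The stability verification implied here is commonly done with control charts. A minimal sketch, assuming an individuals (XmR) chart; the measure and data are invented:

```python
# Sketch of an individuals (XmR) control-chart check, one common way to
# "verify process stability" before baselining; the data is invented.
def xmr_limits(values):
    """Return (center, lower, upper) natural process limits from an XmR chart."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(values) / len(values)
    spread = 2.66 * mr_bar          # standard XmR constant for individuals
    return center, center - spread, center + spread

def is_stable(values):
    """Stable (no special-cause signal) if every point is within the limits."""
    center, lower, upper = xmr_limits(values)
    return all(lower <= v <= upper for v in values)

review_rates = [12.1, 11.4, 12.8, 11.9, 12.3, 11.7, 12.5]   # e.g. defects/KLOC
print(xmr_limits(review_rates), is_stable(review_rates))
```

A stable series like this one would then feed the organization's process-performance baseline; an unstable one would first be taken to special-cause analysis.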
9
OPP Proposed Change #3 of 4
Revise OPP SP 1.5
Current:
Establish and maintain the process-performance models for the organization’s set of standard processes.
Proposed:
Establish and maintain models that predict process performance related to the quality and process-performance objectives.
 The SEI’s new training courses emphasize use of process-performance models with respect to quantitative objectives
 Focusing this practice on these objectives achieves better alignment between the model and training. (.59, .50)
10
OPP Proposed Change #4 of 4
Enhance the informative material
Proposed:
Modify informative material that suggests improving process performance, such as the examples found in OPP SP 1.3 (which imply that common causes of variation be addressed)
New informative material should indicate that, at ML4/CL4, achieving such improvement might be addressed via OPF and GP3.1, while at ML5/CL5, it is more likely to be achieved through CAR, OID, and GP5.2
 In order to delineate level 4 from level 5, the model should avoid implying that common causes of variation are addressed at level 4
 ML4/CL4: Process stability / execution consistency / special causes
 ML5/CL5: Improving capability / systemic improvement / common causes. (.36, .44)
12
QPM Proposed Change #1 of 4
Revise QPM SP 1.4
Current:
SP 1.4 Manage Project Performance
Monitor the project to determine whether the project’s objectives for quality and process performance will be satisfied, and identify corrective action as appropriate.
Proposed:
SP 1.4 Analyze Project Performance
Analyze the collective performance of the project's subprocesses to predict whether the project's objectives for quality and process performance will be satisfied and identify the need for corrective action as appropriate.
 Fixes mismatch between the current title and practice statement
 Recognizes that project management deals with both quantitatively managed and non-quantitatively managed processes. (.54, .57)
13
QPM Proposed Change #2 of 4
Add QPM SP 1.5
Current: <None>
Proposed:
SP 1.5 Use Process-Performance Models
Use calibrated process-performance models throughout the life cycle to identify, analyze, and execute corrective action when necessary.
 Currently, PPMs aren’t expected to be used in QPM
 But use throughout the life cycle appears to be expected by SEI
 PPMs may support process or subprocess activities
 Added practice to SG 1, but it could have been added to SG 2. (.39, .46)
14
QPM Proposed Change #3 of 4
Add QPM SP 2.3
Current: <None>
Proposed:
SP 2.3 Address Special Causes of Variation
Identify, address, and prevent reoccurrence of special causes of variation in the selected subprocesses.
 “Special causes” are featured in SEI materials
 Currently “special causes” are only in QPM’s informative material
 The Glossary definition of “stable process” includes “…and prevent reoccurrences of special causes”
 Add informative material to ensure that process performance data and statistical techniques are used appropriately. (.64, .48)
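Identifying special causes in a statistically managed subprocess typically relies on control-chart detection rules. A hedged sketch using two common rules; the data, center line, and sigma are invented:

```python
# Sketch of special-cause detection on a statistically managed subprocess:
# rule 1 = point beyond 3-sigma limits, rule 2 = run of 8 consecutive
# points on one side of the center line. Data and limits are invented.
def special_cause_signals(values, center, sigma):
    """Return indices of points that trigger a special-cause rule."""
    flagged = set()
    for i, v in enumerate(values):
        if abs(v - center) > 3 * sigma:          # rule 1: outside limits
            flagged.add(i)
    run, last_side = 0, 0
    for i, v in enumerate(values):
        side = 1 if v > center else -1 if v < center else 0
        run = run + 1 if side == last_side and side != 0 else 1
        last_side = side
        if side != 0 and run >= 8:               # rule 2: sustained shift
            flagged.update(range(i - 7, i + 1))
    return sorted(flagged)

effort_variance = [0.2, -0.1, 0.3, 2.9, 0.1, -0.2, 0.0, 0.4, -0.3, 0.1]
print(special_cause_signals(effort_variance, center=0.0, sigma=0.5))
```

Each flagged point is a candidate special cause to investigate, address, and (per the proposed practice) prevent from recurring.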
15
QPM Proposed Change #4 of 4
Revise QPM SP 2.3 (now SP 2.4)
Current:
SP 2.3 Monitor Performance of the Selected Subprocesses
Monitor the performance of the selected subprocesses to determine their capability to satisfy their quality and process-performance objectives, and identify corrective action as necessary.
Proposed:
SP 2.4 Analyze Performance of the Selected Subprocesses
Analyze the performance of the selected subprocesses to predict their capability to satisfy their quality and process-performance objectives, and identify and take corrective action as necessary.
 “Analyze” is a much stronger word than “monitor”
 “Predict” is a much stronger word than “determine”
 Emphasize “taking corrective action,” not just identifying it. (.59, .46)
17
CAR Proposed Change #1 of 7
Thematic Change
 Currently, there is little to suggest that CAR should target statistically managed subprocesses to identify and analyze common causes of variation to address:
 Stable processes with unacceptably high standard deviations;
 Stable processes not capable of achieving quality or process performance objectives; and
 Stable and capable processes that might be improved to enhance competitive advantage
 Change the focus of CAR’s specific goals and practices from “defects and other problems” to “problems”
 By collapsing this phrase, model users will not limit their application of CAR to the subset of problem candidates called “defects”
 Also include a discussion of “opportunities” in the informative material. (.50, .46)
18
CAR Proposed Change #2 of 7
Revise CAR SG 1
Current:
SG 1 Determine Causes of Defects
Root causes of defects and other problems are systematically determined.
Proposed:
SG 1 Determine and Analyze Causes
Common causes of variation and root causes of problems are systematically analyzed.
 Reflects the Thematic Change
 “Analyzed” is a stronger word than “determined”. (.56, .63)
19
CAR Proposed Change #3 of 7
Revise CAR SP 1.1
Current:
SP 1.1 Select Defect Data for Analysis
Select the defects and other problems for analysis.
Proposed:
SP 1.1 Select Data for Analysis
Select for analysis, using established criteria, quantitatively managed processes that are candidates for improvement as well as problems that have a significant effect on quality and process performance.
 Reflects the Thematic Change
 “Significant effect” emphasizes quantitatively managed processes. (.64, .53)
20
CAR Proposed Change #4 of 7
Revise CAR SP 1.2 and add SP 1.3 – SP 1.4
Current:
SP 1.2 Analyze Causes
Perform causal analysis of selected defects and other problems and propose actions to address them.
Proposed:
SP 1.2 Analyze Common Causes
Analyze common causes of variation to understand the inherent quality and process performance constraints.
SP 1.3 Analyze Root Causes
Perform causal analysis on selected problems to determine their root causes.
SP 1.4 Propose Actions to Address Causes
Propose actions to address selected common causes of variation and to prevent recurrence of selected problems.
 Reflects the Thematic Change
 Establishes expectations for BOTH common causes and root causes. (.44, .57)
21
CAR Proposed Change #5 of 7
Add CAR SP 1.5
Current: <None>
Proposed:
SP 1.5 Predict Effects of Proposed Actions
Use process performance models and statistical techniques to predict, in quantitative terms, the effects of the proposed actions, as appropriate.
 Reflects the SEI’s expected use of PPMs and statistical methods in high maturity organizations
 Supports proper cost/benefit analysis. (.52, .58)
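A process-performance model of the kind this practice invokes can be as simple as a fitted regression used to predict an action's effect in quantitative terms. A minimal sketch; the factor, response, and data are invented:

```python
# Sketch of a simple process-performance model (ordinary least squares on
# one factor) used to predict the effect of a proposed action. The factor
# (review coverage), response, and values are invented for illustration.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Historical data: peer-review coverage (%) vs. escaped defects per KLOC
coverage = [40, 55, 60, 70, 80, 90]
escapes = [5.1, 4.2, 3.9, 3.1, 2.6, 2.0]
slope, intercept = fit_line(coverage, escapes)

# Proposed action: raise coverage from 60% to 85%; predict the effect
before = slope * 60 + intercept
after = slope * 85 + intercept
print(f"predicted escaped defects: {before:.2f} -> {after:.2f}")
```

The predicted delta then feeds the cost/benefit analysis the slide mentions; a real PPM would also carry prediction intervals, not just point estimates.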
22
CAR Proposed Change #6 of 7
Revise CAR SG 2, SP 2.1 – SP 2.2
Current:
SG 2 Analyze Causes
Root causes of defects and other problems are systematically addressed to prevent their future occurrence.
SP 2.1 Implement the Action Proposals
Implement the selected action proposals that were developed in causal analysis.
SP 2.2 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.
Proposed:
SG 2 Address Causes
Common causes of variation and root causes of problems are systematically addressed to quantitatively improve quality and process performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a measurable improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process performance.
23
CAR Proposed Change #6 of 7 (continued)
Proposed (copied from the previous slide):
SG 2 Address Causes
Common causes of variation and root causes of problems are systematically addressed to quantitatively improve quality and process performance.
SP 2.1 Implement the Action Proposals
Implement selected action proposals that are predicted to achieve a measurable improvement in quality and process performance.
SP 2.2 Evaluate the Effect of Implemented Actions
Evaluate the effect of implemented actions on quality and process performance.
 Reflects the Thematic Change
 Wording enhanced to focus on measurable improvement of “quality and process performance” – a phrase reserved for high maturity practices
 SP 2.2 modified to include quality as well as process performance
 A perceived oversight in the current practice. (.46, .64)
24
CAR Proposed Change #7 of 7
Revise CAR SP 2.3
Current:
SP 2.3 Record Data
Record causal analysis and resolution data for use across the project and organization.
Proposed:
SP 2.3 Submit Improvement Proposals
Submit process- and technology-improvement proposals based on implemented actions, as appropriate.
 Proposed practice relies on OID to determine “use across the project and organization”
 Recognizes that CAR may have been applied locally but the resulting improvements may be more broadly applicable. (.48, .41)
25
CAR Proposed Change #8 of 7
CAR is the only high maturity process area with no lower-level foundation
 OPP – OPD & MA
 QPM – PP, PMC & IPM
 OID – OPF & OPD
Several alternatives were explored via ATLAS:
0. Leave CAR exactly as it is (-.08, -.19)
1. Add “Causal Analysis” PA at ML2 (-.45, -.55)
2. Add “Causal Analysis” PA at ML3 (-.45, -.26)
3. Add “Causal Analysis” practice to PMC SG2 (+.09, +.16)
4. Add “Issue & Causal Analysis” PA at ML2 (-.55, -.22)
5. Add “Causal Analysis” goal to OPF (-.45, -.22)
27
OID Proposed Change #1 of 7
Revise OID SG 1
Current:
SG 1 Select Improvements
Process and technology improvements, which contribute to meeting quality and process-performance objectives, are selected.
Proposed:
SG 1 Select Improvements
Process and technology improvements are identified proactively, evaluated quantitatively, and selected for deployment based on their contribution to quality and process performance.
 Somewhat passive vs. very proactive
 Focus on quantitative evaluation and ongoing improvement. (.66, .63)
28
OID Proposed Change #2 of 7
Revise OID SP 1.1
Current:
SP 1.1 Collect and Analyze Improvement Proposals
Collect and analyze process- and technology-improvement proposals.
Proposed:
SP 1.1 Solicit Improvement Proposals
Solicit proposals for incremental process and technology improvements.
 “Solicit” is more proactive than “collect”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Explicitly targets incremental improvements. (.66, .43)
29
OID Proposed Change #3 of 7
Revise OID SP 1.2
Current:
SP 1.2 Identify and Analyze Innovations
Identify and analyze innovative improvements that could increase the organization’s quality and process performance.
Proposed:
SP 1.2 Seek Innovations
Seek and investigate innovative processes and technologies that have potential for significantly improving the organization’s quality and process performance.
 “Seek and investigate” is more proactive than “identify”
 “Analysis” is deferred to SP 1.3 and SP 1.4
 Focuses on “significant” performance enhancement. (.65, .50)
30
OID Proposed Change #4 of 7
Add OID SP 1.3
Current: <None>
Proposed:
SP 1.3 Model Improvements
Use process performance models, as appropriate, to predict the effect of incremental and innovative improvements in quantitative terms.
 Adds modeling as an additional “filter”
 Supports quantitative cost/benefit analysis. (.68, .44)
31
OID Proposed Change #5 of 7
Revise OID SP 1.3 (now SP 1.4)
Current:
SP 1.3 Pilot Improvements
Pilot process and technology improvements to select which ones to implement.
Proposed:
SP 1.4 Pilot Improvements
Pilot proposed improvements, as appropriate, to evaluate the actual effect on quality and process performance in quantitative terms.
 Piloting performed “as appropriate”
 Provides rationale for implementation. (.70, .61)
32
OID Proposed Change #6 of 7
Revise OID SP 1.4 (now SP 1.5)
Current:
SP 1.4 Select Improvements for Deployment
Select process and technology improvements for deployment across the organization.
Proposed:
SP 1.5 Select Improvements for Deployment
Select process and technology improvements for deployment across the organization based on an evaluation of costs, benefits, and other factors.
 Provides costs and benefits as the basis for selection
 “Other factors” provides flexibility. (.67, .51)
33
OID Proposed Change #7 of 7
Replace OID SP 2.3
Current:
SP 2.3 Measure Improvement Effects
Measure the effects of the deployed process and technology improvements.
Proposed:
SP 2.3 Measure Improvement Effects
Evaluate the effects of deployed improvements on quality and process performance in quantitative terms.
 Specifies evaluation criteria
 Indicates “quantitative” evaluation
 New informative material – update baselines/models. (.70, .63)
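Evaluating a deployed improvement "in quantitative terms" usually means comparing performance before and after deployment. A minimal sketch with invented samples; a real evaluation would also check stability and statistical significance:

```python
# Sketch of a quantitative before/after evaluation of a deployed
# improvement: compare sample means and report the shift in units of the
# baseline spread. The cycle-time samples are invented for illustration.
from statistics import mean, stdev

def improvement_effect(before, after):
    """Return (absolute change in mean, change as multiple of baseline stdev)."""
    delta = mean(after) - mean(before)
    return delta, delta / stdev(before)

cycle_days_before = [21, 24, 19, 23, 22, 25, 20]
cycle_days_after = [17, 18, 16, 19, 17, 18, 16]
delta, effect_size = improvement_effect(cycle_days_before, cycle_days_after)
print(f"mean change: {delta:.1f} days ({effect_size:.1f} baseline sigmas)")
```

Per the new informative material proposed above, a confirmed shift would also trigger an update of the affected baselines and models.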
35
Change Requests
1. Since the feedback related to the proposed changes was primarily supportive, all will be submitted as Change Requests to the SEI for consideration.
2. Change request submitted for the UCHMP course – add an exercise to re-write high maturity practices using the ATLAS results as the base.
36
Now It’s YOUR Turn!
Handout contains ATLAS #12Z proposing:
 Consolidating ML5 PAs into ML4
 Changing ML5 to “Sustaining Excellence”
   Achieve ML4
     ML4 = OPP, QPM, CAR, & OID
     No additional process areas at ML5
   Perform at high maturity for 2 contiguous years
   Demonstrate sustained business benefit as well
 Submit your input to PACT.otoole@att.net
 Results will be published to all submitters.
38
Download & Contact Information
Refer to the following websites to:
 Contact the authors
 Download the final SEPG 2008 presentation
 Download the supporting ATLAS 12A – 12D results
 Download the CMMI Model and SCAMPI Method Changes presentation from the May 2007 San Francisco Beyond CMMI v1.2 Workshop
Herb Weiner
Herb.Weiner@welchallyn.com
www.highmaturity.com
Pat O’Toole
PACT.otoole@att.net
www.pactcmmi.com

Editor's notes

1. The full wording of Item #1 and its rationale is:
Move SP 1.3 to SP 1.1, shifting SP 1.1 and SP 1.2 to SP 1.2 and SP 1.3 respectively
Current:
SP 1.1 Select Processes
SP 1.2 Establish Process-Performance Measures
SP 1.3 Establish Quality and Process-Performance Objectives
Proposed:
SP 1.1 Establish Quality and Process-Performance Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process-Performance Measures
Rationale:
There are three other process areas in which “objectives” are established – MA, OPF, and QPM. In each of these other process areas, objectives are established in SP 1.1, and the other practices focus on accomplishing them. It is suggested here that OPP be structured in the same manner. Granted, the current ordering may have more intuitive appeal to an “emerging” ML4 organization, but the proposed ordering reflects more of a steady state (i.e., institutionalized) condition.
2. The full wording of Item #2 and its rationale is:
Current:
OPP SP 1.4: Establish and maintain the organization’s process-performance baselines.
Proposed:
OPP SP 1.4: Conduct process-performance analyses on the selected processes and subprocesses to verify process stability and to establish and maintain the organization’s process-performance baselines.
Rationale:
The current OPP SP 1.1 and SP 1.2 both imply that process-performance analysis will be conducted and yet that’s the last we hear of it – so it is proposed that such analyses are explicitly performed here. In addition, as currently emphasized in the informative material, the proposed practice wording suggests establishing baselines for stable processes, a necessary prerequisite for quantitative management.
3. The full wording of Item #3 and its rationale is:
Current:
OPP SP 1.5: Establish and maintain the process-performance models for the organization’s set of standard processes.
Proposed:
OPP SP 1.5: Establish and maintain models that predict process performance related to the quality and process-performance objectives.
Rationale:
The new training courses emphasize the use of process-performance models to compose the defined process and to predict future performance throughout the life cycle with respect to the quantitative objectives. Focusing the expected model component on these objectives achieves better alignment between the model and training.
4. The full wording of Item #4 and its rationale is:
Proposed:
Add informative material that addresses improving process performance such as that found in the example box associated with OPP SP 1.3, subpractice 1, “Decrease the cost of maintenance of the products by a specified %.” (and OPP SP 1.3, subpractice 2, “Shorten time to delivery to a specified % of the process-performance baseline.”) The new informative material should indicate that, at ML4, achieving such improvement might be addressed (or at least attempted) via OPF and GP3.1, while at ML5, such improvement is more likely to be achieved through CAR, OID, and GP5.2. It may be best to add this new informative material as one or more subpractices for SP 1.3 (proposed to become SP 1.1), Establish Quality and Process-Performance Objectives, and/or as a paragraph supplementing OPP SP 1.4, subpractice 5, “Compare the organization’s process-performance baselines to the associated objectives.” In addition, the possibility of revising the objectives to align more closely with the known level of process performance should also be discussed in this new informative material.
Rationale:
In order to delineate CL4/ML4 from CL5/ML5 more clearly, the model should avoid implying that common causes of variation are systematically addressed at CL4/ML4. The SP 1.3 examples indicated above would require changing the currently stable range of process performance, a concept more closely aligned with the ML5 process areas. Similar additions to the informative material will be proposed in ATLAS #12B for QPM as well.
  5. The full wording of Item #4 and its rationale is: Proposed:Add informative material that addresses improving process performance such as that found in the example box associated with OPP SP 1.3, subpractice 1, “Decrease the cost of maintenance of the products by a specified %.” (And OPP SP 1.3, subpractice 2, “Shorten time to delivery to a specified % of the process-performance baseline.” ) The new informative material should indicate that, at ML4, achieving such improvement might be addressed (or at least attempted) via OPF and GP3.1, while at ML5, such improvement is more likely to be achieved through CAR, OID, and GP5.2.It may be best to add this new informative material as one or more subpractices for SP 1.3 (proposed to become SP 1.1), Establish Quality and Process-Performance Objectives and/or as a paragraph supplementing OPP SP 1.4, subpractice 5, “Compare the organization’s process-performance baselines to the associated objectives.” In addition, the possibility of revising the objectives to align more closely with the known level of process performance should also be discussed in this new informative material. Rationale:In order to delineate CL4/ML4 from CL5/ML5 more clearly, the model should avoid implying that common causes of variation are systematically addressed at CL4/ML4. The SP 1.3 examples indicated above would require changing the currently stable range of process performance, a concept more closely aligned with the ML5 process areas. Similar additions to the informative material will be proposed in ATLAS #12B for QPM as well.
  6. The full wording of Item #4 and its rationale is given in note 5.
  7. The full wording of Item #4 and its rationale is given in note 5.
  8. The full wording of Item #4 and its rationale is given in note 5.
  9. The full wording of Item #4's rationale is: Similar to the proposed change for QPM SP 1.4, the word "monitor" is much too weak; "analyze" better reflects the activity expected to be performed by high maturity organizations. Similarly, it is suggested that the performance of selected subprocesses be used to "predict" rather than merely "determine" their capability to satisfy their quality and process-performance objectives. "Predict" implies a higher degree of sophistication than does "determine," and is more closely aligned with the expected behavior of high maturity organizations. The current practice expects project personnel to "identify corrective action"; the proposed practice expects them to "identify and take corrective action." The informative material of this practice should be expanded to refer to the proposed practice SP 1.5 and its use of process-performance models to "identify, analyze, and execute corrective action when necessary." It should also refer to PMC SG 2 for corrective action that does not warrant the use of process-performance models. The informative material should also be enhanced to discuss managing the inherent variation of the measurement system to heighten the probability that the measurement system is providing more "signal" than "noise."
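The "predict" versus "determine" distinction can be made concrete: with a stable baseline, a process-performance model yields a probability of satisfying an objective rather than a yes/no status. A minimal sketch under illustrative assumptions (hypothetical numbers, approximate normality of the stable subprocess):

```python
from statistics import NormalDist

def predicted_capability(baseline_mean, baseline_sigma, objective_ceiling):
    """Predicted probability that the next observation stays at or below the
    objective, assuming the subprocess is stable and roughly normal."""
    return NormalDist(baseline_mean, baseline_sigma).cdf(objective_ceiling)

# Hypothetical review-effort baseline: mean 12 h, sigma 2 h, objective <= 16 h.
p = predicted_capability(12.0, 2.0, 16.0)
```

A prediction below some agreed threshold would then trigger the "identify and take corrective action" behavior the proposed wording calls for.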
  10. The full wording of Item #4 and its rationale is given in note 5.
  11. The full wording of Item #4 and its rationale is given in note 5.
  12. Full explanation of the rationale: The current wording of SG 1 is somewhat passive, as it focuses on meeting current quality and process-performance objectives. It implies that once the objectives are being met, the urgency for ongoing improvement is diminished. The ML5 concept of "optimizing" demands that organizations continuously and proactively seek ways to exceed, not merely meet, these objectives (i.e., once they've achieved "world class," they then strive for "universe class!"). In addition to proactively soliciting improvement proposals, ML5 organizations should experiment with both existing and emerging technologies in an effort to push their quality and process performance to the next level. The proposed changes to the specific practices supporting SG 1 also reflect this more proactive posture. Note: new informative material should be added to every OID specific practice that uses the word "quantitatively." Although not explicitly stated in the practices themselves, the informative material should strongly encourage the use of process-performance baselines and statistical methods where appropriate.
  13. The full wording of the rationale is: The proposed wording of SP 1.1 introduces the following changes:
a. "Collect" is replaced by the more proactive verb "Solicit."
b. The "analyze" portion of this practice is deferred to OID SP 1.3 and SP 1.4.
c. The practice explicitly targets "incremental" improvements, thereby differentiating SP 1.1 more clearly from the "innovative" improvements covered by SP 1.2.
  14. The full wording of the rationale is: The proposed wording of SP 1.2 introduces the following changes:
a. "Identify" is replaced by the more proactive verbs "seek and investigate."
b. The "analyze" portion of this practice is deferred to OID SP 1.3 and SP 1.4.
c. The practice explicitly targets improvements that significantly enhance performance. Innovative change is disruptive and may not be warranted to achieve marginal benefits.
  15. The full wording of the rationale is: The proposed wording of SP 1.4 introduces the following changes:
a. The term "as appropriate" was added to indicate that not all incremental or innovative improvements need to be piloted.
b. "…to select which ones to implement" is somewhat mealy and does not reflect the expected behavior of an ML5 organization. The reworded practice is much more explicit in this regard.
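One way an ML5 organization can make the selection in SP 1.4 explicit rather than "mealy" is to rank candidate improvements by predicted benefit per unit of deployment cost before deciding which to pilot. The sketch below is purely illustrative; the proposal names and numbers are invented:

```python
def rank_improvement_proposals(proposals):
    """Order improvement proposals by predicted benefit per unit deployment
    cost, highest return first - one explicit selection criterion."""
    return sorted(
        proposals,
        key=lambda p: p["predicted_benefit"] / p["deployment_cost"],
        reverse=True,
    )

# Hypothetical candidates with modeled benefit and deployment cost (hours).
candidates = [
    {"name": "static-analysis gate", "predicted_benefit": 40.0, "deployment_cost": 10.0},
    {"name": "new test framework", "predicted_benefit": 90.0, "deployment_cost": 60.0},
]
best = rank_improvement_proposals(candidates)[0]["name"]
```

The "as appropriate" qualifier then maps naturally onto such a ranking: low-risk, high-return items may skip piloting, while disruptive ones are piloted first.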
  16. The full wording of Item #4 and its rationale is given in note 5.
  17. The full wording of the rationale is: The wording of the existing practice provides little direction as to what should be measured and how the measures are to be used. The subpractices extend the measures beyond “the effects of the … improvements” suggesting that actual cost, effort, and schedule for deploying each improvement be captured as well.The proposed wording of SP 2.3 indicates that the organization should evaluate the improvement using the same metrics that were predicted via process modeling and/or initially achieved during the pilots.Furthermore, new informative material should indicate that this evaluation may result in the need to adjust the implementation of recently deployed improvements, to enhance the associated tailoring guidelines, and/or to initiate other forms of corrective action. It’s not enough to simply “measure the effects;” rather, the organization should strive to achieve the benefits.Finally, new informative material should remind the organization to update the corresponding process performance baselines and models based on the quantitative results achieved by the deployed improvements. If quality and process performance has, indeed, been improved, then the organization should expect to continue deriving these benefits in the future.