The Integrated Master Plan (IMP) and Integrated Master Schedule (IMS) provide a strategy for the incremental delivery of program outcomes through increasing maturity assessments using Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key Performance Parameters. These assessments assure the needed capabilities of the project are met at each assessment point, confirming physical percent complete as planned in the Integrated Master Plan.
The Integrated Master Plan and Integrated Master Schedule
1. The Integrated Master Plan and Integrated Master Schedule

The Integrated Master Plan and Integrated Master Schedule are one of the six elements of a credible Performance Measurement Baseline (PMB). The PMB is the source of data for the Program Manager. Some of this data is used in the Integrated Program Management Report (IPMR). Some is used in the assessment of the program's risk. Some defines the deliverables and their Technical Performance Measures.
5.0 Start with the IMP

2. The IMP tells us: where is the program going?

The Plan describes where we are going, the various paths we can take to reach our destination, and the progress or performance assessment points along the way to assure we are on the right path. These assessment points measure the "maturity" of the product or service against the planned maturity. This is the only real measure of progress – not the passage of time or consumption of money.

The Integrated Master Plan (IMP) is a strategy for the successful completion of the project.
4. Quick View to IMP/IMS

- Vertical traceability defines the increasing maturity of key deliverables
- Horizontal traceability defines the work activities needed to produce this increasing maturity
- Both are needed, but the vertical traceability is the starting point
- Program Events, Significant Accomplishments, and Accomplishment Criteria must be defined before the horizontal work activities can be identified
- For all IMP elements, Key Risks must be identified and assigned from Day One, even without mitigations
5. The IMP/IMS is Needed on Both Sides of the Contract

- Vertical traceability defines the increasing maturity of the program's deliverables, measured in Effectiveness (MoE) and Performance (MoP), for the Government.
- Horizontal traceability defines the progress to plan for MoEs and MoPs, with tangible evidence for both the Government and the Contractor.
- Vertical traceability provides the Government with insight into the progress of the MoEs, MoPs, and Technical Performance Measures.
- Horizontal traceability provides insight into cost and schedule performance for the Contractor, reportable through the IPMR to the Government.
6. Misconceptions of the IMP/IMS

Why we don't need/want an IMP/IMS:
- Only required for ACAT I programs
- Too big and burdensome for our small-dollar-value program
- Contractor spends B&P and program budget generating and maintaining the IMP, without measurable benefit
- Doesn't apply on a services contract
- Management tool, not a technical tool
- Doesn't apply to technology programs
- Doesn't apply to R&D efforts
- Doesn't apply to the government
- Help me get a waiver so I don't have to use an IMP/IMS
7. Attributes of the IMP

- Traceability
  - Expands and complies with the SOO, Performance Requirements, CWBS, and CSOW
  - Based on the customer's WBS
  - Is the basis of the IMS, cost reports, and award fees
- Implements a measurable and trackable program
  - Accomplishes integrated product development
  - Integrates the functional activities of the program
  - Incorporates functional, lower-level, and S/C IMPs
- Provides for evaluation of Program Maturity
  - Provides insight into the overall effort
  - Level of detail is consistent with risk and complexity per §L
  - Decomposes events into a logical series of accomplishments
  - Measurable criteria demonstrate completion / quality of accomplishments
8. Attributes of the IMS

- Integrated, networked, multi-layered schedule of the efforts required to achieve each IMP accomplishment
  - Detailed tasks and work to be completed
  - Calendar schedule shows work completion dates
  - Network schedule shows interrelationships and critical path
  - Expanded granularity, frequency, and depth in risk areas
- Resource loading
- Correlates IMS work with IMP events
9. The Importance of the IMP

- The IMP/IMS is the single most important document for a program's success
  - It clearly demonstrates the provider's understanding of the program requirements and the soundness of the approach as represented by the plan
- The program uses the IMP/IMS to provide:
  - Up-front planning and commitment from all participants
  - A balanced design discipline with risk mitigation activities
  - Integrated requirements, including production and support
  - Management with an incremental verification for informed program decisions
10. Just a Reminder, Before Moving On

Page 47, Defense Acquisition Guide, January 10, 2012
Page 317, Defense Acquisition Guide, January 10, 2012
11. Our Goal is Simple …

How can we recognize the reality of the program's current status and its future performance?

The top spins continuously while in a dream – it stops spinning in the real world. – Cobb's totem, Inception
13. Principles of Building a Credible Integrated Master Plan

Building the IMP is a Systems Engineering activity. The Integrated Master Plan is the Program Architecture in the same way the hardware and software are the Product Architecture. A poor, weak, or unstructured programmatic architecture reduces visibility into the Product Architecture's performance measures of cost and schedule connected with Technical Performance Measures.
6.0 Build the IMP

14. Quick View of Building the IMP

- Start with each Program Event and define the Significant Accomplishments and their entry and exit criteria to assess the needed maturity of the key deliverables
- Arrange the Significant Accomplishments in the proper dependency order
- Segregate these Significant Accomplishments into swim lanes for IPTs
- Define the dependencies between each SA
15. A Critical Understanding of the IMP

The IMP defines the connections between the Product maturity – the Vertical – and the implementation of this Product maturity through the Functional activities – the Horizontal.
16. Benefits of this Formality

Objective → Implementation:
- Event-driven plan versus schedule-driven plan, based on completion of tasks, not passage of time → Separate the plan (IMP) from the schedule (IMS), but link elements with a numbering system
- Condensed, easy-to-read "Plan" showing the "Events" rather than the work effort → Indentured, outline format – not text
- Pre-defined entry and exit criteria for major Program Events → Significant Accomplishments for each key Program Event
- Objective measures of progress and completion for each Accomplishment → Pre-defined Accomplishment Criteria (AC) for each Significant Accomplishment (SA)
- Stable, contractual plan, flexible enough to portray program status → IMP part of the contract, IMS a data item
- Capture the essence of functional progress without mandating a particular process for performing the work → Split the IMP into Product and Process
17. Risk Management – Building the IMP Starts at the RFP

[Figure: SOW, SOO, ConOps, WBS, and Technical and Operational Requirements flow into the CWBS & CWBS Dictionary, then into the Integrated Master Plan (IMP), the Integrated Master Schedule (IMS), and the Earned Value Management System, producing the Performance Measurement Baseline – objective status and the essential views to support the proactive management processes needed to keep the program GREEN. Measures applied along the way: Measures of Effectiveness, Measures of Performance, Measures of Progress, JROC Key Performance Parameters, Program-Specific Key Performance Parameters, and Technical Performance Measures.]
18. The IMP / IMS Structure

The IMP describes how program capabilities will be delivered and how these capabilities will be recognized as ready for delivery: Events (or Milestones) → Accomplishments → Criteria. The IMS adds the Work Packages and Tasks, plus Supplemental Schedules.
19. The IMP/IMS Provides Horizontal and Vertical Traceability of Progress to Plan

- Vertical traceability: AC → SA → PE
- Horizontal traceability: WP → WP → AC

Program Events define the maturity of a Capability at a point in time. Significant Accomplishments represent the requirements that enable Capabilities. Accomplishment Criteria are the exit criteria for the Work Packages that fulfill requirements.
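The AC → SA → PE rollup above can be sketched as a small containment model. This is an illustrative sketch only (class names and the PDR example data are ours, not from any IMP standard), assuming each AC belongs to exactly one SA and each SA to one Program Event:

```python
from dataclasses import dataclass, field

@dataclass
class AccomplishmentCriterion:
    name: str
    complete: bool = False  # objective evidence of closure exists

@dataclass
class SignificantAccomplishment:
    name: str
    criteria: list = field(default_factory=list)

    def complete(self):
        # An SA closes only when every exit criterion (AC) is satisfied
        return all(ac.complete for ac in self.criteria)

@dataclass
class ProgramEvent:
    name: str
    accomplishments: list = field(default_factory=list)

    def ready(self):
        # A PE is ready when every entry SA is complete
        return all(sa.complete() for sa in self.accomplishments)

# Hypothetical PDR example
ac1 = AccomplishmentCriterion("Subsystem requirements allocated", complete=True)
ac2 = AccomplishmentCriterion("Interfaces baselined", complete=False)
sa = SignificantAccomplishment("Subsystem design defined", [ac1, ac2])
pdr = ProgramEvent("PDR", [sa])
print(pdr.ready())  # False until every AC closes
```

The point of the structure is the direction of the trace: status rolls up from criteria, never down from the calendar.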
20. The IMP's Role During Execution

[Figure: continuity and consistency from the DRFP & RFP, through Proposal Submittal and the PMB for IBR, to Program Execution. Capabilities-based requirements, the Statement of Work, the WBS, and the program deliverables flow into the IMP (Events, Accomplishments, Criteria) and the IMS (Tasks, BOE, % complete). Budget spreads by Control Account and Work Package, CAIV, and probabilistic risk analysis produce BCWS; timekeeping and ODC produce ACWP; Physical % Complete and Technical Performance Measures produce BCWP; the cost & schedule risk model and Risk Management methods decrease technical and programmatic risk.]
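The three EVMS quantities named in the figure (BCWS planned, BCWP earned, ACWP actual) combine into the standard earned-value variances and indices. A minimal sketch with illustrative numbers:

```python
def earned_value_metrics(bcws, bcwp, acwp):
    """Standard earned-value variances and indices from the three
    EVMS quantities: BCWS (planned), BCWP (earned), ACWP (actual)."""
    return {
        "schedule_variance": bcwp - bcws,  # negative => behind plan
        "cost_variance": bcwp - acwp,      # negative => over cost
        "spi": bcwp / bcws,                # schedule performance index
        "cpi": bcwp / acwp,                # cost performance index
    }

# Illustrative period data: earned 90 of 100 planned, spent 120
m = earned_value_metrics(bcws=100.0, bcwp=90.0, acwp=120.0)
print(m["spi"], m["cpi"])  # 0.9 0.75
```

Because BCWP is driven by Physical % Complete and TPMs rather than by the passage of time, these indices measure the maturity-based progress the deck argues for.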
21. 1st Principle – IMP Building is a Full-Contact Sport
22. Our First Approach to the IMP / IMS Paradigm

- The 1st approach defines Program Events (PE), Significant Accomplishments (SA), and Accomplishment Criteria (AC), derived from the Work Breakdown Structure or the SOW
- This 1st approach can be done in 6 easy steps:
  1. Identify the Program Events (PE) (as the ACQ Guide tells us)
  2. Identify the Significant Accomplishments (SA) from the WBS deliverables
  3. Identify the Accomplishment Criteria (AC) needed to produce these deliverables
  4. Identify the Work Packages for each Accomplishment Criterion
  5. Sequence the Work Packages
  6. Assemble the IMP/IMS
23. This Approach Doesn't Give Us Visibility into What "Done" Looks Like

- We must measure increasing product maturity in units meaningful to the decision makers
- We must see the risks before they arrive so we can take corrective action
24. First Look at a Significant Accomplishment (SA)

- SAs are interim and final steps to define, design, develop, verify, produce, and deploy the product or system.
- SAs must occur in a manner that ensures a logical path is maintained throughout the development effort.
- SAs are event related, not just time coincidental.
- SAs should have one or more of the following characteristics:
  - Consists of a discrete step in the process of planned development that must be complete prior to an event
  - Produces a desired result at a specified event that indicates a level of design maturity (or progress) directly related to each product and process
  - Defines interrelationships, interdependencies, or "hand-off" points of the different functional disciplines applied to the program
25. First Look at an Accomplishment Criterion (AC)

- ACs are definitive measures directly supporting successful completion of a Significant Accomplishment.
- ACs show objective evidence of work progress (maturity of a product); i.e., they can be seen, read, demonstrated, or quantified. These results are usually incorporated into a report or document as evidence of accomplishment.
- ACs are prerequisites for completion of an SA (i.e., exit criteria).
- The questions that need to be repeatedly asked when developing ACs are:
  - How do I know when an accomplishment has been completed?
  - Is the criterion directly related to the accomplishment?
  - Is it proof?
  - What is the work product?
26. The IMP Speaks to Measures of Effectiveness (MoE) and Measures of Performance (MoP)

- This is where TPMs are connected with the MoEs and MoPs
- For each deliverable from the program, all the "measures" must be defined in units meaningful to the decision makers
- Here are some "real" examples:

Measures of Effectiveness (MoE) – Mission Capabilities and Operational Need:
1. Provide Precision Approach for a 200 FT / 0.5 NM DH
2. Provide bearing and range to AC platform
3. Provide AC surveillance to GRND platform

JROC Key Performance Parameters (KPP):
1. Net Ready
2. Guidance Quality
3. Land Interoperability
4. Manpower
5. Availability

Measures of Performance (MoP):
1. Net Ready – IPv4/6 compliance; 1 Gb Ethernet
2. Guidance Quality – accuracy threshold p70 @ 6 m; integrity threshold 4 m @ 10⁻⁶ per approach
3. Land Interoperability – processing capability meets LB growth matrix
4. Manpower – MTBC > 1000 hrs; MCM < 2 hrs
5. Availability – clear threshold > 99%; jam threshold > 90%

Technical Performance Measures (TPM) – Technical Insight, risk-adjusted performance to plan:
1. Net Ready – standard message packets
2. Guidance Quality – multipath allocation budget; multipath bias protection
3. Land Interoperability – MOSA compliant; civil compliant
4. Manpower – operating elapsed-time meters; standby elapsed-time indicators
5. Availability – phase center variations
27. The 1st Problem with the Initial IMP/IMS Paradigm

- There is no single source of guidance for constructing a credible IMP and IMS:
  - A DoD guidebook, but still at Version 0.9
  - A few DoD service pamphlets
  - A commercial guidebook
  - Many contractor guidelines
- No definitive guidance from DoD on what makes a good IMP
- No definitive DID requiring an IMP on specific programs
28. The IMP is a Good Start, But We're Really After the IMP Narratives

- The IMP Narratives start with the Statement of Objectives (SOO) or Concept of Operations (ConOps)
  - These identify the top-level program objectives
  - They define the big picture and provide pre-award trade space
  - They provide the framework for the Contractor to develop the proposal through the IMP
- Together with the Government's Executive Summary, they provide the contractor an understanding of what is needed and what is important
29. The IMP Process Narrative

The IMP Process Narrative describes how the technical and business elements of the program will be conducted, monitored, and controlled. Narratives provide the customer visibility into the contractor's key functional processes and procedures, the relationships among these processes and procedures, and an overview of the effort required to implement them.
30. IMP Narrative for the PDR Program Event

Event description (PDR): PDR establishes the "design-to" allocated baseline to the subsystem level, assures this design meets the functional baseline, and assures system requirements have been properly allocated to the proper subsystem. PDR establishes the feasibility of the design approach to meet the technical requirements and provide acceptable interface relationships between the hardware and other interfacing items. Any changes to the requirements that have occurred since the System Requirements Review (SRR) will be verified at the PDR. PDR assures the design is verifiable, does not pose major IMS or cost risk, and is mature enough to advance to the detailed design phase – CDR.

PE maturity assessment shown in SAs:
- Subsystem-level operational concepts defined
- System-level interfaces baselined
- Supportability plans established
- Software requirements finalized
- Subsystem requirements finalized and allocated
- System verification, validation, and certification plans updated
- PDR subsystem design completed
31. What the IMP Narrative Tells Us

- States the objective of the processes used to build the products
- Provides the governing documents, compliance, and references for the process activities
- Explains the process approach
  - Portrays the key activities of the approach
  - Illustrates the processes tailored for the specific program

[Figure: a process diagram of Inputs and Tools feeding Activities 1–5, producing Outputs and Metrics.]
35. 5000.01, Part 2.B.3 – Acquisition Strategies, Exit Criteria, and Risk Management

"Event-driven acquisition strategies and program plans must be based on rigorous, objective assessments of a program's status and the plans for managing risk during the next phase and the remainder of the program. The acquisition strategy and associated contracting activities must explicitly link milestone decision reviews to events and demonstrated accomplishments in development, testing, and initial production. The acquisition strategy must reflect the interrelationships and schedule of acquisition phases and events based on a logical sequence of demonstrated accomplishments, not on fiscal or calendar expediency."
36. What Makes a Good Program Event (PE)?

- Events are the conclusion of an interval of major program accomplishments (SA) with their criteria (AC)
- IMP events represent key decision transition points between major activities distributed over the contract period
  - The IMP is a mini Authorization to Proceed (ATP) to the next Program Event (PE)
- Some guidance for establishing program/product events:
  - Customer-given events
  - Key decisions needed
  - Risk mitigation events
  - DoD Systems Engineering Technical Review (SETR) guidance
37. What Makes a Good Significant Accomplishment (SA)?

- An IPT can manage it at a working level
- Shows completion and results of discrete steps in the development process
- Indicates maturity of the product through MoEs and MoPs
- Its "significance" measures Program Event status
- Relevant and logically linked to the proper PE
  - Just because the work occurs during the time frame for PE A doesn't mean it's logically linked to PE A
- Uses consistent language, style, and format through a verb dictionary, for example:
  - Segment build 4 detailed design completed
  - Analysis of structural integrity completed
  - Structural integrity verified
38. IMP Significant Accomplishments

- SAs are NOT just a list of "things" to do before the Program Event (PE)
- They are sequenced accomplishments, each of which leads to the PE, e.g. Critical Design Review (CDR):
  - SA #1 = CDR meeting conducted
  - SA #2 = CDR action item work-off plan established
  - SA #3 = 85% drawings completed
  - SA #4 = CDR CDRLs delivered
  - SA #5 = Development environment operational
  - SA #6 = Critical methods analyses completed
  - SA #7 = RVTM approved
39. What Makes a Good Accomplishment Criterion (AC)?

- Provides objective, measurable, and explicit proof of completion and closure of the work activities
- Defines the conditions for closing the Significant Accomplishment (SA)
- Answers the question: how do we know when a Significant Accomplishment has been completed?
40. What Makes a Not-So-Good SA or AC?

- Not significant
  - Too small to significantly contribute to successful event completion
  - Would lead to trivial tasks (e.g., 1-day duration)
- Ambiguous
  - Reader can't tell what Done looks like
- Wrong verb
  - Uses a verb that's not on the list (the verb dictionary)
  - Uses a listed verb incorrectly
  - Doesn't have a verb at all
- Not measurable
  - Can't tell when we're done
- Too many SAs or ACs
  - Confuses the reader and confuses the execution process of the program
  - Dilutes the MoE and MoP
  - Reduces visibility into increasing maturity
41. F-22 Example

- Program Event (PE) – assesses readiness or completion as a measure of progress
  - First Flight Complete
- Significant Accomplishment (SA) – the desired result(s), prior to or at completion of an event, that demonstrate the level of the program's progress
  - Flight Test Readiness Review Complete
- Accomplishment Criteria (AC) – definitive evidence (measures or indicators) that verify a specific accomplishment has been completed
  - SEEK EAGLE Flight Clearance Obtained
42. ! PE’s
are
easy,
they
are
in
the
SETR
and
Integrated
Logis<cs
Lifecycle
Management
System
! The
SA’s
can
be
defined
from
the
program’s
deliverables
–
these
may
not
be
obvious
but
can
be
discovered
with
a
Product
Development
Kaizen
process
(more
later)
! It’s
the
AC’s
that
are
hard
part
–
the
ACs
must
represent
the
exit
criteria
for
the
series
of
Work
Packages
that
do
the
work
to
produce
the
product
30
Now
the
Hard
Part
6.0
Build
IMP
43. 6 Steps to IMP Development

Let's start to build the IMP. This step-by-step process needs to be followed carefully. The IMP is constructed one Program Event at a time – left to right in time. To do otherwise allows confusion and disconnection between Program Events to occur and dilutes our focus on defining what Done looks like for each Program Event.
45. Quick View of the Step-by-Step IMP

1. Identify Program Events
2. Identify Significant Accomplishments
3. Identify Accomplishment Criteria
4. Identify the Work Packages needed to complete the Accomplishment Criteria
5. Sequence the Work Packages (WP), Planning Packages (PP), and Summary Level Planning Packages (SLPP) in a logical network
6. Adjust the sequence of WPs, PPs, and SLPPs to mitigate major risks
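Step 5's "logical network" is, in scheduling terms, a dependency ordering of the Work Packages. A sketch using Kahn's topological sort, with hypothetical WP names, assuming every predecessor appears in the dependency map:

```python
from collections import deque

def sequence_work_packages(deps):
    """Order work packages so every predecessor precedes its successors
    (Kahn's algorithm). `deps` maps a WP to the set of WPs it depends on.
    Raises on a circular dependency, which a valid IMS network cannot contain."""
    indegree = {wp: len(pre) for wp, pre in deps.items()}
    successors = {wp: [] for wp in deps}
    for wp, pre in deps.items():
        for p in pre:
            successors[p].append(wp)
    queue = deque(wp for wp, d in indegree.items() if d == 0)
    order = []
    while queue:
        wp = queue.popleft()
        order.append(wp)
        for s in successors[wp]:
            indegree[s] -= 1
            if indegree[s] == 0:
                queue.append(s)
    if len(order) != len(deps):
        raise ValueError("circular dependency in the WP network")
    return order

# Hypothetical packages feeding one Accomplishment Criterion
deps = {"WP-1": set(), "WP-2": {"WP-1"}, "WP-3": {"WP-1"}, "WP-4": {"WP-2", "WP-3"}}
print(sequence_work_packages(deps))
```

Step 6 (risk mitigation) then reorders only among the orderings this network permits, never against them.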
47. Outcomes of Step 1

- Confirm the end-to-end description of the increasing maturity of the program's deliverables
- Establish the RFP or contract target dates for each Event
- Socialize the language of speaking in "Events" rather than time and effort
48. Events Define the Assessment of the Program's Maturity

- Program Events are maturity assessment points in the program
- They define what levels of maturity for the products and services are needed before proceeding to the next maturity assessment point
- The entry criteria for each Event define the units of measure for the successful completion of the Event
- The example below is typical of the purpose of a Program Event:

The Critical Design Review (CDR) is a multi-disciplined product and process assessment to ensure that the system under review can proceed into system fabrication, demonstration, and test, and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints.
49. Identify the Significant Accomplishments (SA) for Each Program Event (PE)

Actors / Processes / Outcomes:
- Systems Engineer – Identify the Integrated Product Teams (IPTs) responsible for the SAs → Define the boundaries of these programmatic interfaces; define the technical and process risk categories and their bounds
- Technical Lead – Confirm the sequence of SAs has the proper dependency relationships → Define how the product development flow process improves maturity; define the technical risk drivers
- Project Engineer – Confirm the logic of the SAs for project sequence integrity → Define how the program flow improves maturity
- Control Account Manager – Validate SA outcomes in support of PE entry conditions → Confirm budget and resources are adequate for the defined work effort
- IMP/IMS Architect – Assure the assessment points provide a logical flow of maturity at the proper intervals for the program → Maintain the integrity of the IMP, WBS, and IMS
50. Outcomes of Step 2

- The Significant Accomplishments are the "road map" to the increasing maturity of the program
- The "Value Stream Map" resulting from the flow of SAs describes how the products or services move through the maturation process while reducing risk
- The SA map is the path to "done"
51. SAs Define the Entry Criteria for Each Program Event

[Figure: example entry-criteria flow for "Preliminary Design Review Complete".]
52. Identify Accomplishment Criteria (AC) for Each Significant Accomplishment (SA)

Actors / Processes / Outcomes:
- CAM – Define and sequence the contents of each Work Package and select the EV criteria for each Task needed to roll up the BCWP measurement → Establish ownership for the content of each Work Package and its exit criteria – the Accomplishment Criteria (AC)
- Project Engineer – Identify the logical process flow of the Work Packages to assure the least-effort, maximum-value, and lowest-risk path to the Program Event → Establish ownership for the process flow of the product or service
- Technical Lead – Assure all technical processes are covered in each Work Package → Establish ownership for the technical outcome of each Work Package
- IMP/IMS Architect – Confirm the process flow of the ACs can follow the DID 81650 structuring and risk assessment processes → Guide the development of outcomes for each Work Package to assure the increasing maturity of the program
53. Outcomes of Step 3

- The definition of "done" emerges in the form of deliverables rather than measures of cost and passage of time
- At each Program Event, the increasing maturity of the deliverables is defined through the Measures of Effectiveness (MoE) and Measures of Performance (MoP)
54. ACs Are Higher-Fidelity Descriptions of "Done" Than SAs

[Figure: example criteria flow for "Critical Design Review Complete".]
55. Identify the Work for Each Accomplishment Criterion in Work Packages

Actors / Processes / Outcomes:
- Control Account Manager – Identify or confirm that the work activities in the Work Package represent the allocated work → Bounded work effort defined "inside" each Work Package
- Technical Lead – Confirm this work covers the SOW and CDRLs → All work effort for 100% completion of the deliverable visible in a single location – the Work Package; confirm risk drivers and duration variances
- IMP/IMS Architect – Assist in sequencing the work efforts in a logical manner → The foundation of the maturity flow starts to emerge from the contents of the Work Packages
- Earned Value Analyst – Assign the initial BCWS from the BOE to each Work Package → Confirmation of work effort against BOEs; define the EVT for measuring progress to plan
56. Outcomes of Step 4

- The work that produces a measurable outcome is identified
- This work is defined in each Work Package
- The Accomplishment Criteria (AC) state explicitly what "done" looks like for this effort
- With "done" stated, Measures of Performance and Measures of Effectiveness can be defined
57. Work is Done in "Packages" That Produce Measurable Outcomes

[Figure: example Work Package flow for "Launch Readiness Review Complete".]
58. Sequence Work Packages (ACs) for Each Significant Accomplishment (SA)

Actors / Processes / Outcomes:
- Control Account Manager – Define the order of the Work Packages needed to meet the Significant Accomplishments for each Program Event → Define the process flow of work and the resulting accomplishments; assure value is being produced at each SA and the ACs that drive them
- IMP/IMS Architect – Assure that the sequence of Work Packages adheres to the guidance provided by DCMA and the EVMS System Description → Begin structuring the IMS for compliance and loading into the cost system
- Program Controls Staff – Baseline the sequence of Work Packages using Earned Value Techniques (EVT) with measures of Physical Percent Complete → Develop insight into progress to plan with measures of physical progress for each Work Package (EVT)
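The Program Controls role above — baselining Work Packages with Physical Percent Complete — amounts to earning each package's budget in proportion to measured physical progress. An illustrative sketch (WP names and numbers are hypothetical):

```python
def rolled_up_bcwp(work_packages):
    """Roll up BCWP from physical percent complete: each Work Package
    earns its budget (BCWS) in proportion to measured physical progress,
    not in proportion to elapsed time."""
    return sum(wp["bcws"] * wp["physical_pct"] for wp in work_packages)

wps = [
    {"name": "WP-A", "bcws": 50.0, "physical_pct": 1.0},  # done
    {"name": "WP-B", "bcws": 30.0, "physical_pct": 0.5},  # half done
    {"name": "WP-C", "bcws": 20.0, "physical_pct": 0.0},  # not started
]
print(rolled_up_bcwp(wps))  # 65.0 of the 100.0 budgeted
```

Comparing this rollup against cumulative BCWS at the same status date gives the schedule position in the program's own units of maturity.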
59. Outcomes of Step 5

- Work Packages partition work efforts into "bounded" scope
- Interdependencies constrained to Work Package boundaries prevent "spaghetti code"-style schedule flow
- Visibility of the increasing flow of maturity starts to emerge from the flow of Accomplishment Criteria (AC)
61. Assemble the Final IMP/IMS

Actors / Processes / Outcomes:
- IMP/IMS Architect – Starting with the ACs under each SA, connect the Work Packages in the proper order for each Program Event → Establish the Performance Measurement Baseline framework; identify the MoE and MoP points in the IMP
- Program Manager – Confirm the work efforts represent the committed activities for the contract → Review and approve the IMS, ready for baseline; review and approve the risk drivers and duration variance models
- Project Engineer – Assess the product development flow for optimizations → Review and approve the IMS, ready for baseline; identify risk drivers and their mitigations
- Systems Engineer – Confirm the work process flows result in the proper products being built in the right order → Confirm risk drivers and duration variances; review and approve the IMS, ready for baseline
62. Outcomes of Step 6

- Both the maturity assessment criteria and the work needed to reach that level of maturity are described in a single location
- Risks are integrated with the IMP and IMS at their appropriate levels
  - Risks to Effectiveness – risk to JROC KPPs
  - Risks to Performance – risk to program KPPs and TPMs
- Leading and lagging indicator data are provided through each measure to forecast future performance
63. The Previous 6 Steps Result in a Credible IMP/IMS

- The IMP is the "Outer Mold Line," the framework, the "going forward" strategy for the program.
- The IMP describes the path to increasing maturity and the Events measuring that maturity.
- The IMP tells us "how" the program will flow with the least risk, the maximum value, and the clearest visibility of progress.
- The IMS tells us what work is needed to produce the product or service at the Work Package level.

Our Plan tells us "how" we are going to proceed. The Schedule tells us "what" work is needed to proceed.
65. Horizontal and Vertical Traceability of the IMP/IMS

The Integrated Master Schedule is the work sequenced to produce the outcomes for each WP.
- Vertical traceability: AC → SA → PE
- Horizontal traceability: WP → WP → AC

Program Events define the maturity of a Capability at a point in time. Significant Accomplishments represent the requirements that enable Capabilities. Accomplishment Criteria are the exit criteria for the Work Packages that fulfill requirements.
66. The IMP's Connection to the WBS

- Start with the Significant Accomplishments and sequence them to the maturity flow for each Program Event
- The WBS connections then become orthogonal to this flow

Work Breakdown Structure element by Program Event (SAs per cell):
- 4.920-SDAI: SRR – A01, A02; SDR – B01; PDR – C01, C02; CDR – D01; TRR – E01; ATLO – F01
- 4.200-Sys Test: SRR – A05; SDR – B03, B04; CDR – D02, D03; TRR – E02; ATLO – F02
- 4.300-Radar: SRR – A03; SDR – B02; PDR – C03; TRR – E03
- 4.330-O&C Sys: SRR – A06, A07; SDR – B05; PDR – C04; CDR – D04; TRR – E04; ATLO – F03, F04
- 4.400-I&T: SRR – A08; PDR – C05; TRR – E05, E06; ATLO – F05
- 4.500-Support: SRR – A09; CDR – D05; TRR – E07; ATLO – F06, F07
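The matrix above can be held as a simple lookup so the SAs feeding any Program Event can be pulled across the WBS. A sketch using two of the slide's rows (the WBS element and SA codes are the slide's; the function name is ours):

```python
# Two rows of the PE-by-WBS matrix: for each WBS element, which
# Significant Accomplishments support which Program Event.
sa_matrix = {
    "4.920-SDAI": {"SRR": ["A01", "A02"], "SDR": ["B01"], "PDR": ["C01", "C02"],
                   "CDR": ["D01"], "TRR": ["E01"], "ATLO": ["F01"]},
    "4.200-Sys Test": {"SRR": ["A05"], "SDR": ["B03", "B04"],
                       "CDR": ["D02", "D03"], "TRR": ["E02"], "ATLO": ["F02"]},
}

def accomplishments_for_event(matrix, event):
    """Collect every SA across the WBS that feeds one Program Event —
    the orthogonal cut through the maturity flow."""
    return sorted(sa for row in matrix.values() for sa in row.get(event, []))

print(accomplishments_for_event(sa_matrix, "SRR"))  # ['A01', 'A02', 'A05']
```

Reading the structure by WBS element gives the horizontal (work) view; reading it by event gives the vertical (maturity) view, which is exactly the orthogonality the slide describes.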
73. Nuances of These 6 Steps

Building the Program Event, to Significant Accomplishment, to Accomplishment Criteria decomposition is straightforward. For each Program Event, simply identify the needed Significant Accomplishments for the entry and exit criteria, and the Accomplishment Criteria for the Work Packages that produce them. Yeah right, no problem.
75. Quick View of the Nuances

- Unfortunately, building a credible IMP/IMS is a nuanced process, subject to many opportunities for diversions, blind alleys, and false starts
- It is slightly counterintuitive, compared with the traditional scheduling approach, to start with the vertical integration – but it is critical to start vertically
- Success requires the full participation of Systems Engineering, the CAMs, and the Program Manager
- Success requires everyone to understand the nuances of the IMP-building effort
78. The 3rd Nuance – Everything Foots and Ties to the IMP & IMS

Beginner:
- The IMS contains all the proper fields in columns and is horizontally linked
- The WBS elements can be found for all work elements
- CDRLs are visible, and their multiple delivery dates are connected to each Program Event
- The WBS is structured in a product manner, or possibly a functional manner, with some deliverables defined in the terminal nodes

Intermediate:
- The WBS is properly formed inside each AC, with incremental deliverables
- The WBS numbers form a "well structured" tree, but it is still not "pure" in the sense of deliverables only, no functional elements
- Each column and each field can be "pivoted" to form a proper "tree" of value flow

Advanced:
- The WBS is a "pure" Product Breakdown Structure (PBS) plus the services needed to produce those products
- The WBS defines the structure of the delivered product or service
- The vertical trace of the IMP describes the flow of increasing maturity of these products or services
- The horizontal trace of the IMP describes the work to be done to produce this maturity
79. The 4th Nuance: The IMP/IMS Is the Programmatic Architecture
Beginner:
! The IMP is built from the WBS for each Program Event
! The IMP is seen as a compliance document that lists the Program Events and a "bunch of stuff" underneath
Intermediate:
! The IMP is structured around separate Program Events, but below the SAs it looks like a "shop floor" schedule with little vertical connectivity
Advanced:
! The IMP is built as a "value stream" flow for the program, but the Systems Engineers …
! This programmatic architecture is built in the same way the technical system architecture is built
! It is derived from the ConOps and the Tier 1 System Requirements
! The IMP shows explicitly how these are supported in the flow of the SAs
80. The 5th Nuance: The IMP/IMS Connects All the Dots
[Diagram: the IMP/IMS connects Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Risk (aleatory uncertainty, epistemic uncertainty) with Reference Classes, Past Performance, SME judgment, the System Architecture, and AHP.]
81. Connecting the IMP to Program Performance Measures
Assembling the IMS from the IMP appears to be a straightforward process: detail the tasks that support the Accomplishment Criteria. But there are some critical steps that must be done in the right order to end up with a risk-tolerant IMS.
Let's do this for TSAS.
82. The Primary Role of the IMP Is to Describe What Done Looks Like, in MoEs and MoPs
On 19 October 1899, Robert Goddard decided that he wanted to "fly without wings" to the Moon.
9.0 Framework
83. Quick View of the IMP/IMS Framework
! Measures of increasing maturity for the key deliverables are the foundation for increasing the Probability of Program Success
! Measures of Effectiveness (MoE) and Measures of Performance (MoP) are defined in the Integrated Master Plan (IMP) Narrative
! Key Performance Parameters (KPP), both JROC and program specific, are needed
! Technical Performance Measures (TPM) are needed for all key deliverables
84. Starting Out on the Right Foot …
! Most IMP/IMS literature states how to build an IMP from the RFP and contractual elements, in simple and maybe simple-minded terms
– Decompose the events into SAs, ACs, and their Tasks – sounds easy
! This approach fails to provide advice on several things:
– How to minimize the topological connections between Events
– How to increase the concurrency between IPTs
– How to increase the tolerance of the IMS to disruptive events
• Known and knowable risk
• Unknown and possibly unknowable risk
! The construction of the IMS needs to take place in what seems to be a reverse order
– Build the IMP as a Value Stream Map describing the increasing maturity, and therefore the increasing VALUE, of the deliverables to the customer
– It's the delivery of Customer Value that inverts the management process and focuses on keeping the Program GREEN as planned, which maximizes value to the customer (the Government)
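The "minimize the topological connections between Events" advice can be made concrete by counting dependency links that cross Program Event boundaries: the fewer crossings, the more tolerant the IMS is to disruption inside any one event. A minimal sketch; the task names and event assignments below are invented for illustration.

```python
# Each task is assigned to a Program Event; dependencies are
# (predecessor, successor) pairs between tasks.
event_of = {"T1": "SRR", "T2": "SRR", "T3": "PDR", "T4": "PDR", "T5": "CDR"}
dependencies = [("T1", "T2"), ("T2", "T3"), ("T3", "T4"), ("T2", "T5")]

# Links whose endpoints sit in different events couple those events
# topologically and propagate disruption between them.
cross_event = [(p, s) for p, s in dependencies if event_of[p] != event_of[s]]
print(len(cross_event))  # 2 of the 4 links couple separate events
```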
85. SETR Program Events
https://acc.dau.mil/docs/technicalreviews/dod_tech_reviews.htm
86. Build PEs Left to Right
Start with SRR (or something on the left) and completely define its completion before moving to the next PE.
! Define the SAs for an Event and construct a work flow of the activities needed to satisfy each SA
– These activities are not yet tasks, so don't commit too soon to defining the detailed work
– Isolate the SAs by event first; only work on one event at a time
! Identify the participants in the work
– What IPTs participate in this work?
– What swim lanes are needed to isolate the IPTs?
! Define the elements
– Activities performed to satisfy the SA
– Deliverables that result from these activities
! These are still not Accomplishment Criteria (AC), but that comes next
87. The Accomplishment Criteria (AC)
! A definitive measure or indicator that verifies completion of work for the accomplishment
– Completed work effort
• Manufacturing Plan Completed
– Confirmation of performance compliance
• Flight Test Report Approved
– Incremental verification
• Maintenance Demonstration Completed
– Completed critical process activities
• Risk Management Plan Approved
88. The Accomplishment Criteria (AC)
! Defines the measure by which a Significant Accomplishment (SA) is considered "done"
! Terms like complete, delivered, and closed have no "units of measure" in the context of an SA and are open to interpretation
! Terms like …
– Measures of completion: 80% of drawings approved for release
– Counts of available items: 75% of pin-outs assigned voltage
– Fidelity of a design: outer mold line defined within 90% of target
– Error bounds: spacecraft mass known to ±20%
– Performance parameters: disconnect force within allowed limits
– Maturity parameters: flight article successful in last 3 tests
… are used to define the "exit criteria"
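Quantified criteria like these lend themselves to mechanical evaluation: each has a measured value, a threshold, and a direction, so "done" is a number rather than an opinion. A minimal sketch; the names, measured values, and thresholds below are invented for illustration.

```python
# Hypothetical quantified exit criteria in the spirit of the examples above:
# (name, measured value, threshold, direction).
criteria = [
    ("drawings approved for release", 0.83, 0.80, "min"),  # must be >= 80%
    ("pin-outs assigned voltage",     0.71, 0.75, "min"),  # must be >= 75%
    ("spacecraft mass error",         0.18, 0.20, "max"),  # must be <= 20%
]

def satisfied(value: float, threshold: float, kind: str) -> bool:
    """True when the measured value meets the threshold in the given direction."""
    return value >= threshold if kind == "min" else value <= threshold

results = {name: satisfied(v, t, kind) for name, v, t, kind in criteria}
for name, ok in results.items():
    print(f"{name}: {'met' if ok else 'NOT met'}")
```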
89. Two Types of Accomplishment Criteria: Entry and Exit
! Entry Criteria substantiate readiness for the review
! Exit Criteria substantiate successful completion of the review
! Critical Design Review (CDR) example:
– Are we ready for the Flight Test Readiness Review?
– How do we know the FTRR was a success?
– What did we learn from the FTRR that increases the maturity of the program's deliverables?
90. The IMP Focuses Us on Measures of Effectiveness and Performance
[Diagram: the Mission Need flows down through MoE, KPP, MoP, and TPM. The Acquirer defines the needs and capabilities in terms of Operational Scenarios; the Supplier defines the physical solutions that meet the needs of the Stakeholders.]
– MoE: operational measures of success related to the achievement of the mission or operational objective being evaluated
– MoP: measures that characterize physical or functional attributes relating to the system operation
– TPM: measures used to assess design progress, compliance to performance requirements, and technical risks
91. Measure of Effectiveness (MoE)
! Measures of Effectiveness …
! Are stated in units meaningful to the buyer,
! Focus on capabilities independent of any technical implementation,
! Are connected to mission success.
"The operational measures of success that are closely related to the achievement of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions." "Technical Measurement," INCOSE–TP–2003–020–01
MoEs belong to the End User.
92. Measure of Performance (MoP)
! Measures of Performance are …
! Attributes that assure the system has the capability to perform,
! Assessments of the system to assure it meets the design requirements to satisfy the MoE.
"Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions." "Technical Measurement," INCOSE–TP–2003–020–01
MoPs belong to the Program – developed by the Systems Engineer, measured by the CAMs, and analyzed by PP&C.
93. Key Performance Parameters (KPP): Both JROC and Program Specific
! Key Performance Parameters …
! Have a threshold or objective value,
! Characterize the major drivers of performance,
! Are considered Critical to Customer (CTC).
"Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program." "Technical Measurement," INCOSE–TP–2003–020–01
The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like.
94. Technical Performance Measures (TPM) for Key Deliverables
! Technical Performance Measures …
! Assess design progress,
! Define compliance to performance requirements,
! Identify technical risk,
! Are limited to critical thresholds,
! Include projected performance.
"Attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal." "Technical Measurement," INCOSE–TP–2003–020–01
95. What Are Technical Performance Measures, Really?
! TPMs are measures of the system's technical performance that have been chosen because they are indicators of system success. They are based on the driving requirements or on technical parameters of high risk or significance, e.g., mass, power, or data rate.
! TPMs are analogous to the programmatic measures of expected total cost or estimated time-to-completion. There is a required performance, a current best estimate, and a trend line.
! Actual versus planned progress of each TPM is tracked so the systems engineer or project manager can assess progress and the risk associated with that TPM.
! The final, delivered system value can be estimated by extending the TPM trend line and using the recommended contingency values for each project phase.
! The project-life trend-to-date, current value, and forecast of all TPMs are reviewed periodically (typically monthly) and at all major milestone reviews.
96. Tracking the Technical Performance Measures
! Tracking TPMs and comparing them to the resource growth provides an early warning system to detect deficiencies or excesses
! Reserve allocations narrow as the design proceeds
! TPMs that violate reserve allocations, or whose trends do not meet the final performance, trigger corrective actions
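The "extend the TPM trend line" guidance above can be sketched with a simple least-squares fit: fit the observations to date, extrapolate to the final review month, and compare the forecast against the allocation. The TPM values, allocation, and review month below are invented for illustration.

```python
# Hypothetical monthly observations of a mass TPM (kg) against a 100 kg allocation.
months = [1, 2, 3, 4, 5]
mass_kg = [88.0, 90.5, 92.0, 94.5, 96.0]
allocation_kg = 100.0
final_month = 9

# Least-squares line through the observations to date
n = len(months)
mean_x = sum(months) / n
mean_y = sum(mass_kg) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, mass_kg))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

# Extend the trend line to the final review month to forecast the delivered value
forecast = intercept + slope * final_month
print(f"forecast at month {final_month}: {forecast:.1f} kg")
if forecast > allocation_kg:
    print("trend violates the allocation: trigger corrective action")
```

Here the trend (about 2 kg of growth per month) forecasts roughly 104 kg at month 9, breaching the 100 kg allocation, which is exactly the early-warning signal the slide describes.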
98. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Purpose
! Ensure the system hardware initial design has been updated and meets functional and allocated performance requirements within program constraints.
! Operational security concept assessed.
! Ensure training requirements have been analyzed and their objectives defined for training missions.
! Confirmation that training objectives and the MTC design and integration conform to the Air Force syllabus and the Ready Aircrew Program (RAP).
! Training plan will be updated.
Hardware PDR – Expectations
! Team agrees the system hardware initial design has been updated and can proceed to the detailed design phase.
! Team agrees training plans and objectives correlate with the Air Force syllabus, RAP, and training planning; development can continue.
! System Specification and TTL requirements are traceable to the allocated hardware design.
99. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Entry Criteria
! Functional Baseline Authenticated (FBA)
! Initiate system hardware initial design and allocate functions to the appropriate Configuration Items
! All specifications updated and required documentation made available, including anticipated lower-level design documentation
! All SRR/SFR action items closed or dispositioned
100. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Accomplishments
! System hardware initial design complete.
– Functions allocated to one or more hardware configuration items and traceable to the MTC SSS and the TSSC SSS.
– Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors have been reviewed.
! Draft instructor and operator manuals reviewed.
! Program risks updated, assessed, and reviewed.
– Mitigation plans in place.
! Program schedule and constraints updated and reviewed.
– Critical schedule path drivers reviewed.
! Design criteria for the simulation and database development reviewed and updated.
! Program processes and metrics reviewed.
! Test planning activities and relevant documentation reviewed by the test team.
101. Sample IMP: A Flight Avionics System (Concluded)
Hardware PDR – Exit Criteria
! Hardware (ownership, visual, IOS, brief/debrief) design reviewed, allocated to a hardware configuration item, and updated to include instructor and operator interfaces, malfunction and control requirements, etc.
! RTM updated; MTC SSS, TSSC SSS, and TTL traceable to the allocated hardware design, to include ESOH requirements.
! Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors reviewed.
! MTC and TSSC allocated baselines established and controlled by appropriate-level documentation for PDR.
! Draft instructor and operator manuals reviewed with user concurrence, incorporating satisfactory human factors design into the operator interfaces.
! Risk management and mitigation plans updated, in place, addressing ESOH plans and risks, and within program constraints.
! Risks assessed, documented, accepted, and understood by the team.
! Program schedule reviewed
– Critical path drivers identified
– IMS updated and reflecting the critical paths
102. One More IMP Sample: Preliminary Design Review
! (SA) System & Segment Requirements Updated & Allocated
– (AC) SRR / SDR Update Review Conducted
– (AC) Preliminary System Specification Documents (A011) Baselined
– (AC) Preliminary Spacecraft Segment Specification Baselined
– (AC) Preliminary Ground Segment Specification Baselined
– (AC) Preliminary Specification Tree Baselined
! (SA) Preliminary ICDs Baselined for Customer Review
– (AC) Preliminary Space-Ground ICD Baselined
! (SA) PDR System Design Completed
– (AC) Top Level System Architecture Updated
– (AC) PDR Level System Analyses Completed
– (AC) PDR Level Reliability / Availability Analysis Completed
– (AC) Preliminary System Level Risk Assessment Completed
– (AC) System Level Plans Updated for PDR
– (AC) Flight Long Lead Review Conducted
103. The "But" for This Guidance
! With these samples and the SETR guidance, we have only just started
! The program needs to define program-specific events to assure the actual maturity measures are captured
– "The IMP provides sufficient definition to track the step-by-step completion of the required accomplishments for each event and to demonstrate satisfaction of the completion criteria for each accomplishment." [AFMC PAMPHLET 63-5]
104. IMP Verbs for Significant Accomplishments
Integrated Master Plan Allowable Verbs
Allocated: The segment requirement is flowed down from the System Specification.
Released: The item is approved for delivery to the intended customer or supplier; all internal distribution and sign-offs are complete. An electronic version is made accessible on the IDE.
Completed: The subject item, data, document, or process is prepared or concluded, and reviewed and accepted by the responsible IPT. Supporting documentation is available through the IDE.
Reviewed: The subject item, data, document, or process is prepared or concluded, and documented for completion. Supporting documentation is available through the IDE.
Conducted: The subject meeting or review has been held with all required participants. The charts or minutes are available through the IDE.
Updated: The subject process, data, or document has been reevaluated using later information, and adjustments incorporated.
Defined: The subject configuration items, data, or document was submitted to the customer.
Validated: Requirements are validated, have received contractor approvals, were distributed, and are available through the IDE.
Established: The subject item is created and set in place in a manner consistent with its intended use, after review and acceptance by the IPT.
Verified: Requirements are verified or processed in accordance with established practice.
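A verb table like the one above can serve as a simple lint for SA and AC names: each name should end in one of the approved past-tense verbs. A minimal sketch; the sample names are illustrative, and "Baselined" is flagged only because it is absent from this particular table (a program could choose to extend the list).

```python
# Allowable verbs taken from the IMP verb table above.
ALLOWED_VERBS = {"Allocated", "Released", "Completed", "Reviewed", "Conducted",
                 "Updated", "Defined", "Validated", "Established", "Verified"}

def lint_name(name: str) -> bool:
    """True when the SA/AC name ends with an allowable IMP verb."""
    return name.split()[-1] in ALLOWED_VERBS

names = [
    "SRR / SDR Update Review Conducted",
    "Top Level System Architecture Updated",
    "Preliminary Specification Tree Baselined",
]
for name in names:
    print(f"{name}: {'ok' if lint_name(name) else 'verb not in the allowable list'}")
```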
105. The IMP Process Narrative
! Objective: a brief statement explaining why this process set is applied to this program
! Governing Documentation: a list of the guidance or compliance documents, e.g., specifications, manuals, and procedures, including company, government, and industry references
! Approach: a concise description of who owns each process, what the roles and responsibilities are, and the overall process, including a process flow diagram
106. Generic IMP Evaluation Criteria
! Do the Program Events and Accomplishments reflect the logical evolution and progress of the overall Program?
– Do the program events and their definitions clearly demonstrate the maturity of the program over its life?
– Do the selected Accomplishments and associated Criteria identify meaningful and measurable progress toward the key goals of the Program?
! Do the Accomplishments for each event demonstrate a meaningful understanding of the program requirements, or are they tasks that anyone could do for any contract?
– Do they reflect your SOW requirements?
! Does the IMP structure readily map to the IPT structure, such that each IPT can easily visualize the scope of its responsibility?
! When awarded, could the contractor use the Accomplishments as discrete activity cost accounts in their earned value system, or are they level-of-effort in nature?
! Is sufficient visibility provided to identify and track the Program Risk Plan and the associated risk mitigation accomplishments and/or contingencies?
– Do the criteria supporting the accomplishments include the key performance requirements?
! Are IPT cross-dependencies, and dependencies external to the Program, appropriately reflected if they represent potential schedule or performance risks to the success of the Program?
! Is the submitted "Contract IMP" (the Product IMP that is to be included as part of the Program Contract) defined to the appropriate level?
– Do the Accomplishments and associated Criteria go down to a level sufficient to provide visibility into key subcontractor activities upon which the success of the Program may depend?
– Are the Events and Accomplishments included in the IMP at such a level as to make maintenance of the "Contractual IMP" practical, or does it include an unnecessary level of detail?
107. Program Management Levels
Tier 1 (Program Manager, Technical Leads, IPT Manager; technical performance goals)
– IMP/IMS elements: Major Program Events (PE); Significant Accomplishments (SA)
– CWBS: Levels 1 & 2; links to CLINs; Levels 3 & 4; Control Packages; link to the PBS; integrated EVMS
Tier 2 (Control Account Managers; Product Work Plan; responsible organization elements)
– IMP/IMS elements: Accomplishment Criteria (AC); Tasks (NA)
– CWBS: Level 5; Cost Account Package; cost collection level; links to the WBS by OBS; resource summaries; early warning EVMS analysis
Tier 3 (Work Package Manager; detailed plans)
– IMP/IMS elements: Work Packages
– CWBS: earned value calculations
108. Connecting the Components of the IMP/IMS
! The assembled IMP/IMS links all work activities vertically to the ACs, SAs, and PEs
[Diagram: Customer Requirements flow into the IMP (Events, Accomplishments, Criteria, and Process Narratives), which the ORG/IPTs elaborate into the Integrated Master Schedule (IMS): Control Accounts, Work Packages, and Work Package Tasks, tied to the WBS, Supplemental Schedules, Risk and Opportunity, the Program Performance Management System, and Performance Analysis / Management Review.]