2. Vision
To become the most preferred business partner to our customers through leadership in our actions, values and social responsibility

Mission
To be a world-class organization in enabling clients to become leaders in their industry

People · Business · Technology

Values
LEAD by Example: Leadership, Empower, Agile, Decisive

A Business, Technology and Talent Development Consulting Company with a focus on Healthcare, Retail & IT
9. Enter Agile
Some Agile principles:
• Value for the customer as early as possible
• Eliminate waste (WIP, YAGNI)
• Drive and respond to change, quickly
• Time/capacity boxing (see Scrum and Kanban)
• Provide visibility into project progress
• Create …
10. Why do Metrics Matter?
Reasons #1 & #2
• We have to!
• Self-defense!
Every stakeholder wants to know what's going on, through a quantified measure!
11. Why do Metrics Matter?
Reason #3
To make business decisions
• Decision-making frequency increases multi-fold
• Such as:
  – Should we start this effort?
  – Which team needs the most help now?
  – When do we stop working on this product backlog?
  – Do we understand the customer better?
  – Did it actually help to remove that impediment?
12. Why do Metrics Matter?
Reason #4
• To get feedback, so that forward-looking guesses have a higher probability of being right
• We make a guess (aka an estimate), and then we check later how good the guess was
• If it is off by a lot... maybe: "gee, we need to learn how to estimate better"
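The feedback loop on slide 12 can be sketched in a few lines: compare what we guessed against what actually happened, and watch the average error. This is a minimal illustration; the data, function name, and error measure are my assumptions, not from the deck.

```python
# Hypothetical sketch of the estimate-vs-actual feedback loop.
# Positive error means we under-estimated; zero means a perfect guess.

def estimation_error(estimated, actual):
    """Relative error of each guess: (actual - estimate) / estimate."""
    return [(a - e) / e for e, a in zip(estimated, actual)]

estimates = [3, 5, 8, 2]   # story points we guessed at planning
actuals   = [4, 5, 13, 2]  # effort the work really took

errors = estimation_error(estimates, actuals)
avg_error = sum(abs(e) for e in errors) / len(errors)
print(f"average absolute error: {avg_error:.0%}")  # -> average absolute error: 24%
```

If the average error stays high sprint after sprint, that is the "gee, we need to learn how to estimate better" signal.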
13. Why do Metrics Matter?
Reason #5
• To change behavior...
  – Not just the key business decisions
  – But, as close as possible, all the behavior on a day-to-day basis
14. What you can't measure you can't manage!
What you can't manage you can't control!
28. The Team wants metrics. Why?
• To help them see their work
• To plan with
• To determine when they are successful
• To push back on magical-thinking managers
• To challenge themselves
29. Some key attitudes
• We accept that things always were and always will be imperfect
• We relentlessly pursue perfection
32. The Agile approach
• Truth and transparency are essential
• The metrics are first for the Team
• Typically, we trust the Team
33. The Agile approach - 2
• Managers can visit a team at any time to see the meaning of any numbers
• Managers have the patience and respect to observe the Gemba
35. Good Metrics
Good metrics should:
1. Be accurate enough to enable better decision making
2. Enable better actions and serious improvement
3. Not be seriously gamed (made inaccurate); ideally, "gaming" the metric is actually better behavior
4. Change the behavior of all members of the team and related managers
5. Motivate the team (or at least not de-motivate it)
6. Be simple enough that they are done, and used well
7. Enable optimizing the whole

Attributes of good metrics: Measure Results, not Output · Vital Few · Measure Trends · Easy to Collect · Reinforce Desired Behavior · Amplify Learning
37. Metrics Perspectives from the Trenches
"I once worked for a team that started doing TDD, and we decided to measure the number of unit tests written during a sprint. The metric reminded us of the fact that we wanted to write tests, and it provided opportunities for celebrating as a team that we were implementing our decision. We weren't judged on the metric from outside the team, though; we didn't try to maximize it, and as soon as writing tests became an integral part of our work, we abandoned the metric, because it wasn't useful anymore."
38. Metrics should not scare or threaten people
Enforced metrics are often cheated or ignored
39. Agile Metrics
• Velocity
• % change in Velocity since (inception, last year)
• Stories completed (done, done)
• Number of passing unit and functional tests (today, or with growth trend)
• Number of story points completed to date; % of total
• Bugs that escaped the Sprint
• Bugs open today
• % BV completed (if using BV points or similar)
• Oldest bug open (with Sev level)
• Sprints with stories incomplete
• Sprints with added stories
• Unplanned tasks (in the X Sprint); related hours
• Full Product Backlog (remaining stories)
• Impediments open vs. resolved
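Two of the metrics above, velocity and % change in velocity since inception, fall straight out of a list of "done, done" story points per sprint. A minimal sketch, with illustrative numbers of my own:

```python
# Derive velocity and % change since inception from per-sprint
# completed story points (values are illustrative, not from the deck).

sprint_points = [18, 21, 24, 23, 27]  # "done, done" points per sprint

velocity = sprint_points[-1]          # most recent sprint
baseline = sprint_points[0]           # inception sprint
pct_change = (velocity - baseline) / baseline * 100

print(f"current velocity: {velocity}")
print(f"change since inception: {pct_change:+.0f}%")  # -> +50%
```

The same list also yields "number of story points completed to date" (a running sum) and the trend needed for the velocity-variance checks in the case studies later.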
40. Agile Metrics – More
• Stories added to / subtracted from the Release
• Age of each story to done, done; average age
• Impediments removed to date
• Defects identified after done, done
• Defects identified after release
• If starting with a big bug list:
  – Bugs added (old features) (per time)
  – Old bugs resolved/closed (per time)
  – Old bugs remaining (over time)
• Builds that passed/failed initially, to date
• If starting with minimal automated tests:
  – Number of automated tests (unit, functional, etc.)
  – Number of manual tests (that could be automated)
  – Effort on manual testing
• Metrics around quality of builds and regression tests
• Metrics around quality of code (e.g., cyclomatic complexity)
• Code coverage by automated tests (unit, functional, etc.)
41. Some Kanban-Specific Metrics
• Lead time
• Cycle time
• Time spent in each lane?
• Bottlenecks?

Little's Law:
Cycle Time = Number of Things in Process / Average Completion Rate

Flow = Speed × Density; as density rises and speed falls => traffic jam
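Little's Law as quoted on the slide is a one-line calculation. A minimal sketch (the function name and example numbers are mine):

```python
# Little's Law: average cycle time follows directly from how much work
# is in process and how fast work items complete.

def cycle_time(wip, avg_completion_rate):
    """Number of things in process / average completion rate."""
    return wip / avg_completion_rate

# e.g. 12 cards on the board, team finishes 3 cards per day
print(cycle_time(wip=12, avg_completion_rate=3))  # -> 4.0 (days)
```

This is also why the slide's traffic-jam analogy holds: piling more work onto the board (raising WIP) without raising the completion rate makes every item's cycle time longer.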
43. Which metrics work for you is important!
Case Study 1
• Rework ratio
• Defects closed vs. resource capacity
• New defects injected (open + closed – monthly iterations)
• Defects injected per developer per week
• Review Yield = (defects captured in review) / (total defects captured in review + testing)
• RCA (for client defects and return/rework defects from iterations) and corrective actions
• Avg. story points (expected vs. actual)
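The Review Yield formula from Case Study 1 translates directly to code. A small sketch (function and variable names are mine, not the slide's):

```python
# Review Yield: what fraction of all captured defects were caught
# in review rather than later, in testing.

def review_yield(review_defects, test_defects):
    """Defects captured in review / (defects captured in review + testing)."""
    return review_defects / (review_defects + test_defects)

print(f"{review_yield(review_defects=30, test_defects=20):.0%}")  # -> 60%
```

A rising yield suggests reviews are catching defects earlier, which is usually the cheaper place to catch them.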
44. Which metrics work for you is important!
Case Study 2
• At completion of a sprint: sprint status (G/Y/R) and overall project health (G/Y/R)
  – The sprint status depends on the # of user stories planned versus done; if a US is not 'done', it is not counted among the completed US
  – Project health depends not only on US done, but on cross-functional dependencies (internal + external) and risk status. If any of those aren't Green, project health is Y or R.
• Bug status reported at sprint completion
• Stories progress - hours completed vs. remaining
• Stories progress - how much work remains on each story; what's the progress towards completing the work on each story
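The roll-up rule in Case Study 2 can be sketched as a small function: project health is Green only when the sprint and every dependency/risk status are Green. The escalation rule (any Red anywhere forces Red, otherwise any Yellow forces Yellow) is my assumption; the slide only says health degrades to Y or R.

```python
# Hypothetical G/Y/R roll-up for project health, per Case Study 2's
# description. Statuses are single-letter codes: "G", "Y", "R".

def project_health(sprint_status, dependency_statuses):
    statuses = [sprint_status] + list(dependency_statuses)
    if "R" in statuses:   # assumed: any Red dominates
        return "R"
    if "Y" in statuses:   # assumed: otherwise any Yellow degrades health
        return "Y"
    return "G"            # all Green -> project is Green

print(project_health("G", ["G", "Y"]))  # -> Y
```

The useful property of an explicit rule like this is that it removes debate at the sprint review: the status is computed, not negotiated.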
45. Which metrics work for you is important!
Case Study 3
• Project:
  – Burn-downs for Sprints
  – Burn-ups for Releases (they provide greater visibility into the value that can be derived at a certain point in time)
  – Defect arrival and kill rates
  – Team velocity (with an eye on variance over time; reducing variance is a sign of a stabilizing system)
• Program/Portfolio:
  – Cycle times
  – Lead times
  – Throughput
  – Wait times, and more...
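Case Study 3's "eye on variance over time" can be made concrete by comparing the spread of early sprints against recent ones. A sketch, with window size and data as illustrative assumptions:

```python
# Is velocity variance shrinking? Compare the standard deviation of the
# first few sprints against the most recent few (window is an assumption).
from statistics import pstdev

def variance_trend(velocities, window=4):
    """Return (early stdev, recent stdev); a falling value suggests a stabilizing system."""
    return pstdev(velocities[:window]), pstdev(velocities[-window:])

early, recent = variance_trend([12, 20, 15, 25, 21, 22, 20, 21])
print(f"early stdev {early:.1f} -> recent stdev {recent:.1f}")
```

Here the spread drops sharply across the two windows, which is exactly the stabilizing-system signal the slide describes.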
46. Which metrics work for you is important!
Case Study 4
Sprint Level
• Velocity
• Wastage in hours/Sprint
• Service test automation pass rate (Program)
• Regression pass rate (team-wise & Program)
• User story - DoD (Definition of Done) completeness
• Code coverage
Program Level
• Business Value delivered vs. Business Value backlog
47. Which metrics work for you is important!
Case Study 5
Release/Sprint Level:
– % progress
– Acceptance criteria data
– User Acceptance Test: accepted data
– Rework data after UAT
– Test cases executed
– Test cases returned
– Velocity of sprint
– Actual acceptance planned vs. final accepted
The above, in turn, give the quality of the sprints.
Program Plan level:
– Number of stories mapped to each release/Sprint
– Number of stories for the total release
– Probable cost per sprint based on projected acceptance criteria
Line Management perspective:
– % of progress in each release
– Cost per sprint / cost per release
– Cost burn-down
– Value burn-up
Estimations:
– Either use effort in time for each story,
– or in-depth sub-task-level effort
Team Level:
– No. of Sprints planned
– Total no. of team members in each release
– FTEs & consultants
– Total no. of cross-functional team members
– Future forecast of team members
– Ramp-up data across sprints
– Happiness factor
From Customer Validation:
– CSI
Large-Scale Integration Projects:
– Integration hand-offs
50. Extended Metrics
Themes: Value, Quality, Predictability, Collaboration, Continuous Improvement
Metrics: Customer Surveys, NPS, Burn Up/Burn Down, NPV/ROI, Work-in-Progress, Cost per Sprint/Point, Velocity, Real Value Delivered, Team Surveys, Story Cycle Time, Technical Debt, RTF/Automated Tests, Defects