Design’s value is in these 4 activities
• Driving Understanding & Empathy
• Creating Clarity & Behavioral Fit
• Exploring the possible & desirable
• Envisioning experiences
Observing, interviewing, & learning
with the people for whom we will be designing.
Synthesizing actionable insights to
direct teams to design the right things.
Providing the right structures to help
people achieve value.
Present words & images to maximize
efficient communication of meaning.
Fit the behavior of the system(s) to the
behaviors of the humans who use them.
Possible & Desirable
Validation is not an end point,
but a beginning point.
Association of a multiplicity of
ideas is at the heart of
Tell stories with words &
images to convey insights.
Using narratives to express
the human possibilities of a
Some skills to create that value
• Visual Thinking
• Information Structure & Navigation
• Activity sequencing and modeling
• Workshop Facilitation
Different types of metrics
• Qualitative data, quantified.
• Gathered through automated instrumentation
• Moment in time
Cascading use of data
Metric: What is available to measure that relates to our hypothesis?
If we compare the original metric to another metric can
that help understand the original hypothesis?
What does the correlation tell us?
Which direction indicates the desired effect?
What measure of the metric will tell us we reached an
otherwise qualitative goal?
How do the prime metric and correlated metric compare over time?
How strong of a correlation in the trend would be significant? (differential)
What can we map against a timeline to help us understand and interpret
possible moments of cause and effect? (such as releases, ship dates)
The value of a metric, or a combination of metrics, at the
beginning of any initiative.
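The last step of the cascade, turning an otherwise qualitative goal into a measurable threshold against a baseline, can be sketched in a few lines. This is a minimal illustration, not from the deck: the metric (NPS), the numbers, and the threshold are all made up for the example.

```python
# Hypothetical weekly NPS readings; values are illustrative only.
nps_by_week = [32, 33, 31, 35, 38, 41]

# Baseline: the value of the metric at the beginning of the initiative.
baseline = nps_by_week[0]

# A quantified stand-in for a qualitative goal ("noticeably happier customers").
threshold = baseline + 8

# Which weeks reached the quantified goal?
weeks_meeting_goal = [week for week, nps in enumerate(nps_by_week) if nps >= threshold]
print(f"Baseline NPS: {baseline}, goal threshold: {threshold}")
print("Weeks meeting the goal:", weeks_meeting_goal)
```

The point of the sketch is the structure, not the numbers: the baseline is captured before the initiative starts, and the threshold is an explicit, agreed-upon translation of the qualitative goal.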
Hypothesis: What do we want to learn from our data?
Setting up your measurement*
*extrapolated from a case study by Intuit
Data Type: Quantitative
Collected by: Self-reporting
Using a tool such as Harvest
Increase design quality &/or
Increasing time designing will
increase design quality &/or
Measuring Desired Outcomes:
NPS, Heuristics, Usability
Correlations are not answers.
They are signals.
For Correlations to work …
… The two metrics need to be plotted on the same timeline.
… The correlation itself needs to be along a
continuous and steep grade.
… Major exceptions to the correlation over
time need to be followed up on thoroughly.
How can you assess quality?
Is the product organization aligned in their understanding of the
value of your design(ing) to the business & their customers?
1. There is no alignment across the product organization.
2. There have been gains in alignment seen by open trials of design and
research activities and processes.
3. Alignment is growing, as seen by more non-designers participating in design activities.
4. Design value is well understood and consistently articulated across the organization.
Top 3-5 metrics that tell you
something might be wrong,
or everything is ok.
• Number of UX stories
that started in a sprint’s
backlog, but didn’t get
deployed to production.
• Attrition rate within a
design team compared to
the whole organization.
• Time spent designing/
• Is recruitment leading to best-in-class talent being hired?
• Is the team engaged and
• Are the values being upheld?
• Are diversity & inclusion upheld as important values?
• Teams are meeting the needs of
• How much of the total design process are
team members being encouraged &/or
allowed to do?
• Are designs regularly being included in
• Does the team have line of sight into the
team & business?
• Is the signal:noise ratio being managed?
• Is tribal knowledge easily available?
• Is the team able to get the tools they need to be successful & productive?
• Are tools easily integrated with each other, and with the broader set of
• Are the mission & vision in place and well understood?
• Are the team’s principles being used to
evaluate the quality of design work?
• Are decision-making processes better understood, and acted upon?
• Is the DesignOps team creating and
maintaining relationships with key
BusinessOps teams to ensure DxD smooth
How do you know if you are
measuring the right things?
Getting to the right design
Imagining possible futures
Getting the design right
Impact of Agile
• Research, interpretations, synthesis, insights,
empathy are not part of most agile processes.
• Deliverable as “working software” doesn’t
encourage exploring and creating in the abstract.
• Single cadence for the entire team doesn't consider
that design & research practices work differently.