In a combined presentation with Telstra, we offered a unique, fresh and evidence-based approach to the often-controversial topic – qual or quant? We demonstrate how linking quantitative with qualitative techniques can significantly improve the ability to understand customers – and consequently design services to meet their needs, improve experiences, and ultimately measure success.
Leveraging business intelligence with service design frameworks
Most companies collect a large amount of data in the form of customer feedback, but due to the structure and size it is often underutilised. Let us show you how we created a service framework using this information for Telstra – one that tests the end-to-end customer experience by aligning both quantitative and qualitative research, the best of both worlds! See the techniques we applied, as well as how the framework for Telstra’s products and services relates to service design and testing.
This service framework has provided a better, more holistic service experience for customers. The feedback from our qualitative counterparts has been amazing; it has revolutionised the way they do UX and CX research. Not only do they use it as a tool to understand existing service environments, they can now prioritise findings on key user and customer experiences that have the biggest impact in driving changes and improvements.
Instead of relying on a small sample of information to draw conclusions about a market or experience, researchers now have the added value of quantitative information to gain further credibility with stakeholders – and ultimately drive better business outcomes.
We hope that our presentation will help you take away what we have learned, and what strategies we recommend, to maximise outcomes for your business too.
A bridge between two worlds – where qual and quant meet: Slides from UX Australia 2015
1. Rosie Handby – Head of Analytics & Insights, U1Group
Renee Farnham – Strategy Lead, The Design Practice, Telstra
A bridge between two worlds
2. Telstra and U1 Collaboration
We believe…
People make the world go round
Many minds are better than one
Optimism fuels creativity
Failure improves learning
Strategy Lead at Telstra’s Design Practice
Over ten years working in different industries embedding customer insights into business strategy.
Design Practice is an exciting place to be! We are turning Telstra into an experience-led innovation company. We are passionate about making our Telstra customers’ home and work lives better and are constantly taking customer-centred design to the next level.
Head of Analytics and Insights at U1 Group.
15+ years’ experience in quantitative research and statistics.
U1 is a user and customer experience research company providing expert advice and insight. I head up an exciting new area in U1 called Analytics & Insights. We are bringing leading-edge quantitative research and data analysis (not just web analytics!) into our user research to make data-driven, evidence-based conclusions at both strategic and tactical levels.
3. Identifying the need
Define what a positive service experience means for the customer in a testable way that is:
Consistent, quantifiable and measurable, and
Can be easily communicated across multiple business units.
Testing is often limited to “sunny day scenarios”
No tangible understanding of success
Difficult to prioritise research findings
Lost in translation across business lines
4. The Two Worlds
Quantitative Research
Customer Feedback
Operational Metrics
Tracking Studies
Market Research
Qualitative Research
Customer and User Experience
Product Lifecycle
Strategy & Idea development, Concept Testing
Defining core principles, journey mapping, service maps
Design and Prototyping
User Testing and Service Walkthroughs
Post Launch Review
5. SERVICE DRIVER FRAMEWORK
Quantitative research
Qualitative research
7. What is the Service Driver Framework?
Identifies the most significant aspects of a customer service experience that have a positive or negative impact on the customer.
Example Service Driver Framework

Theme | Service Driver | Specific experiences
Reliability & Trust | Resolution | Proactive resolution of issues
Reliability & Trust | Keeping Promises | Keeping technician appointments, following up on call backs, etc.
Efficiency & Simplicity | Activation Timeframe | Overall time taken to activate product
Efficiency & Simplicity | Simplicity/Customer Effort | Simple process to follow for customer
Communication | Fully Informed | Keeping customer informed on an ongoing basis
Communication | Staff Knowledge | Knowledge conveyed to customers about products/services
Staff Attitude | Well Mannered | Polite, friendly, helpful, courteous, etc.
9. Creation of the Service Driver Framework
Workshops and in-depth interviews to understand internal processes and external experiences across multiple business units.
Type of customer and experience?
Available information and data sources?
Data analysis methodology?
Consistent and easily understood terminology
14. How do we use the Service Driver Framework?
Benchmarking and tracking customer feedback and metrics to monitor changes and identify success.
Advocacy impact assessment and gap analysis to identify and prioritise risks to customer experience.
Prioritise results from user and customer experience research and testing.
15. Practical Application
New process for installing a consumer service.
Objectives
Understand the risk to the customer experience upon implementation of the consumer service.
Identify the most significant technical or sales process changes that could potentially impact advocacy upon launch.
Calculate an advocacy risk score to prioritise the risks.
Advocacy Impact Assessment
16. Practical Application
Each service driver is assessed against quantitative sources (NPS feedback, operational metrics), qualitative sources (service walkthroughs & trials, technical & sales process changes), and a risk assessment (severity of impact to the customer, likelihood to occur, and the resulting advocacy risk score):

Resolution – NPS feedback: X% negative comments; Operational metric: % customers with unresolved issues; Qualitative: related issues; Severity: High; Likelihood: Medium; Advocacy Risk Score: High
Keeping Promises – NPS feedback: X% negative comments; Operational metric: % missed appointments; Qualitative: related issues; Severity: Medium; Likelihood: Medium; Advocacy Risk Score: High
Activation Timeframe – NPS feedback: X% negative comments; Operational metric: average activation timeframe; Qualitative: related issues; Severity: High; Likelihood: High; Advocacy Risk Score: Very High

Our approach
1. Review quant and qual research
2. Map to service driver framework
3. Identify key risks to customer experience
4. Track quant metrics to determine success
Advocacy Impact Assessment
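The deck does not state the rule for combining severity and likelihood into an advocacy risk score, so the following is only an illustrative sketch: a simple severity × likelihood lookup chosen to be consistent with the three example rows above (the actual Telstra/U1 scoring method may differ).

```python
# Hypothetical risk-scoring sketch -- the presentation does not specify
# the real formula; this is one rule consistent with its example rows.
LEVELS = {"Low": 1, "Medium": 2, "High": 3}

def advocacy_risk(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood ratings into an advocacy risk score."""
    product = LEVELS[severity] * LEVELS[likelihood]
    if product >= 9:
        return "Very High"
    if product >= 4:
        return "High"
    if product >= 3:
        return "Medium"
    return "Low"

# Reproducing the three example rows:
print(advocacy_risk("High", "Medium"))    # Resolution -> High
print(advocacy_risk("Medium", "Medium"))  # Keeping Promises -> High
print(advocacy_risk("High", "High"))      # Activation Timeframe -> Very High
```

Scoring risks numerically like this is what lets research findings be ranked in a defensible order rather than by gut feel.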
17. Practical Application
Outcomes
Advocacy Impact Assessment
[Chart: service drivers – Customer effort, Keeping promises, Resolution, Staff attitude, Activation timeframe, Fully informed, Simple & clear language – plotted by severity of impact to customers against likelihood of occurrence.]
18. A Designer’s Perspective
“In discovery we use this to help design our research materials and interview scripts.”
“These drivers become quick ways for us to elicit structured data from our stakeholders.”
“When we are defining the problem we are trying to solve, we use this as a catalyst for thinking during stakeholder workshops.”
“We use this as an existing structure in which to cut down the amount of time we spend doing affinity analyses.”
20. Addressing the needs
Issue: Prioritisation of research findings. Solution: Match research findings to the framework to determine order of priority.
Issue: Testing in controlled, “sunny day” environment. Solution: Identification of most significant issues in the current “rainy day” environment.
Issue: No tangible understanding of success. Solution: Pre- and post-review of testing.
Issue: Results lost in translation. Solution: Shared language.
SERVICE DRIVER FRAMEWORK
21. What have we learned?
Linking quantitative with qualitative techniques can significantly improve the ability to understand customers holistically.
We can understand our customer needs better than ever. We can improve experiences, and ultimately measure success.
We have created a tool to understand existing service environments and prioritise findings on customer experiences that have the biggest potential to drive change.
22. CONTACT US
RENEE FARNHAM
Strategy Lead, The Design Practice, Telstra
Renee.Farnham@team.telstra.com.au | 03 8649 5283
ROSIE HANDBY
Head of Analytics & Insights, U1 Group
rosie@u1group.com | 03 9684 3470
Editor’s notes
Telstra measures customer experience by tracking feedback and coming up with a Net Promoter Score (NPS).
The score alone does not give enough information from a product design and build perspective.
We needed to dig deeper into the feedback to come up with specific areas that impact the customer and measure these – hence the Service Driver Framework (SDF)!
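For context on the metric mentioned above: NPS is computed from 0–10 "likelihood to recommend" ratings as the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). A minimal sketch (not Telstra's actual tooling):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Two promoters (10, 9) and two detractors (6, 3) out of six responses:
print(nps([10, 9, 8, 7, 6, 3]))  # -> 0.0
```

The single headline number is exactly why the deck argues for the SDF: a score of 0.0 says nothing about *which* part of the service experience produced the detractors.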