This document discusses measuring quality for JIRA Cloud releases. It begins with principles for metrics, including starting with questions to answer, collecting metrics to drive decisions rather than as an end, and being willing to discard metrics. It then discusses context around JIRA Cloud and challenges in measuring its quality. Specific metrics proposed include number of incidents and support cases per release. The document advocates learning from measurements by focusing on prevention initiatives rather than just root causes. It emphasizes continuous improvement through metrics.
How do we measure the quality of JIRA deployments to Cloud?
1. Measuring Quality of JIRA Cloud
Michał Kujałowicz, JIRA QA Team Lead
michal.kujalowicz@spartez.com
2. • Involved in development of core Atlassian products
• Commercial, Open-Source add-ons, customisations
• Developers, Product Managers, Designers, QAs, Agile Team Leads
• QA for Quality Assistance
Partner
3. • 1 QA per 10 developers
• Testing in the hands of developers (both manual and automatic)
• Teaching/coaching developers to become great testers
Quality Assistance
4. • Process improvement
• Test tools, test environments
• Customer insight
• Prevention, data-driven decisions
• Defect analysis, continuous improvement
Those other things
8. Software Quality
is the degree to which software possesses a desired combination of attributes (e.g. reliability, interoperability) [IEEE 1061].
It also refers to two distinct notions:
- software functional quality
- software structural quality [Wikipedia]
10. Measurements
• It’s not about counting things
• It’s about estimating the value of something
http://kaner.com/pdfs/SoftwareRelatedMeasurements.pdf
11. Measurements
• ATTRIBUTE: the thing you want to measure
• INSTRUMENT: the thing you use to take a measurement
• READING: what the instrument tells you when you use it to measure something
• MEASUREMENT: the reading obtained
• METRIC: the function that assigns a value to the attribute, based on the reading
http://kaner.com/pdfs/SoftwareRelatedMeasurements.pdf
23. JIRA Cloud contd.
• Incident management process
• Regressions with the biggest impact
• Thousands of issues reported in the official bug tracker per year
27. Start with questions you need answers for
• What is the quality of our releases to JIRA Cloud? (Functional + Perceived by customers)
• What bugs are we letting through?
• What is their severity?
• What are the root causes of those bugs?
• How can we prevent them?
Principles in action
28. • Goal: Being able to provide facts about the quality of releases to JIRA Cloud
• Goal: Being able to take further decisions
Collecting metrics is not the goal
Principles in action
Identification Metrics!
29. Metrics/Measures
Metric: Number of incidents per release
Metric: Average number of Support Cases per incident for each release
Incident = every failure causing more than 4 Support Cases
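The two metrics above can be computed directly from per-bug support-case counts. A minimal sketch, assuming a hypothetical list of (release, support_case_count) pairs and made-up release names rather than the real JIRA data model:

```python
from collections import defaultdict

# Slide definition: an incident is any failure causing more than
# 4 Support Cases.
INCIDENT_THRESHOLD = 4

def release_metrics(bugs):
    """bugs: iterable of (release, support_case_count) pairs.
    The pair shape is a hypothetical stand-in for the real data."""
    per_release = defaultdict(list)
    for release, support_cases in bugs:
        if support_cases > INCIDENT_THRESHOLD:
            per_release[release].append(support_cases)
    return {
        release: {
            "incidents": len(cases),
            "avg_support_cases_per_incident": sum(cases) / len(cases),
        }
        for release, cases in per_release.items()
    }

bugs = [("7.1.0", 12), ("7.1.0", 2), ("7.1.0", 6), ("7.2.0", 5)]
metrics = release_metrics(bugs)
# 7.1.0 has two incidents (12 and 6 support cases); the 2-case bug
# does not cross the threshold, and 7.2.0 has one incident.
```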
30. Customer feedback
[Diagram: a customer reports a problem either to the Support JIRA instance (e.g. SUPPORT-7893) or to the official bug tracker (e.g. BUG-12345); support cases are linked to bug reports, and failures that escalate become incidents handled by the incident management process (DEV, SOPS).]
32. Measurement system - part 1
[Diagram: the Support JIRA instance, the official bug tracker and the incident instance feed a DB.]
Get all links and store the number of Support Cases for every bug
33. Measurement system - part 2
[Diagram: as in part 1, plus an internal JIRA instance fed by defect analysis scripts.]
Get all links and store the number of Support Cases for every bug
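The link-harvesting step can be sketched as a small function over a JIRA-style issue payload. The `issuelinks` shape below is a simplified stand-in for what the JIRA REST API returns, and the issue keys are invented; treat the field names as assumptions:

```python
def count_support_cases(issue):
    """Count Support Cases linked to a bug, given a simplified
    JIRA-style issue payload (the field shape is an assumption)."""
    links = issue.get("fields", {}).get("issuelinks", [])

    def linked_key(link):
        # a link points either inward or outward to another issue
        other = link.get("inwardIssue") or link.get("outwardIssue") or {}
        return other.get("key", "")

    return sum(1 for link in links if linked_key(link).startswith("SUPPORT-"))

# the "DB" from the slide, sketched as a dict: bug key -> support case count
db = {}
bug = {
    "key": "BUG-12345",
    "fields": {"issuelinks": [
        {"inwardIssue": {"key": "SUPPORT-7893"}},
        {"outwardIssue": {"key": "SUPPORT-7911"}},
        {"inwardIssue": {"key": "BUG-99"}},  # a bug link, not a support case
    ]},
}
db[bug["key"]] = count_support_cases(bug)  # stores 2 for BUG-12345
```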
34. Measurement system - part 2
[Flow: get all issues reported in the bug tracker yesterday → check the support case count in the DB → more than 2 support cases? → create or update an issue in the internal JIRA. Repeat the process for issues created 3, 7 and 14 days ago.]
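The daily flow on this slide can be sketched as a loop over check ages. The three callables are hypothetical stand-ins for the real bug-tracker query, the DB lookup and the create-or-update step against the internal JIRA instance:

```python
from datetime import date, timedelta

SUPPORT_CASE_THRESHOLD = 2          # slide: "more than 2 support cases?"
CHECK_AGES_DAYS = [1, 3, 7, 14]     # yesterday, then 3, 7 and 14 days ago

def daily_check(fetch_bugs_created_on, support_case_count,
                upsert_internal_issue, today=None):
    """Sketch of the daily measurement run; the three callables are
    placeholders for the real integrations."""
    today = today or date.today()
    escalated = []
    for age in CHECK_AGES_DAYS:
        day = today - timedelta(days=age)
        for bug_key in fetch_bugs_created_on(day):
            if support_case_count(bug_key) > SUPPORT_CASE_THRESHOLD:
                upsert_internal_issue(bug_key)  # create/update internal issue
                escalated.append(bug_key)
    return escalated
```

Running this once a day re-checks each bug as it ages, so a bug that only accumulates support cases later is still picked up at the 3, 7 or 14 day mark.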
37. • Some incidents do not have Support Cases
• Not normalised against the number of customers
• Difficult to derive severity from it
• Influenced by time to fix (good or bad?)
Metrics will not be perfect!
Principles in action
38. Do we have a problem with quality?
• "It is less than 0.2% of customers"
• "You cannot prevent all bugs"
• "It is a significant number"
• "Even 1 Support Case is too much"
• "0.2% reported; how many could not work?"
40. • Improve the Post Incident Review process
• Create a Tech Debt team
• Kick off several initiatives/projects for better prevention
Drive the decisions
Principles in action
41. PIR
• Ticket
• Severity
• Document Owner
• Report Status
• Executive Summary
• Do we know the root cause?
• Has the root cause been mitigated?
• Root Cause
• Outage Description
• Was this a repeat of a previous incident?
• Affected users
• Start Date/Time (UTC)
• End Date/Time (UTC)
• Duration
• Time to Detection
• Time to Recovery
• What went well?
• What could have gone better?
• Where did we get lucky?
• Priority actions to fix root cause(s)
• Actions to improve service quality and/or mitigate risk
• Names of people involved
• Approvers
42. PIR Improvements
• Focus on prevention action items
• Created in team’s backlogs
• Measurable and Trackable
• SLA for development teams to fix those issues
43. Action Items
• Do not fix the root cause only
• Try to prevent the whole class of issues
46. Improvement metrics
• % of completed action items
• Number of incidents and support cases per release
• Number of incidents and support cases per released change
47. Number of incidents and support cases per released change
• LOC or tasks (issues)?
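The two candidate denominators can be compared side by side. The release records and their field names below are hypothetical, not an actual release schema:

```python
def incidents_per_change(releases):
    """releases: list of dicts with hypothetical fields
    'name', 'incidents', 'loc_changed' and 'tasks'."""
    return [
        {
            "release": r["name"],
            "incidents_per_kloc": r["incidents"] / (r["loc_changed"] / 1000),
            "incidents_per_task": r["incidents"] / r["tasks"],
        }
        for r in releases
    ]

releases = [
    {"name": "7.1.0", "incidents": 4, "loc_changed": 20000, "tasks": 80},
    {"name": "7.2.0", "incidents": 4, "loc_changed": 5000, "tasks": 40},
]
rows = incidents_per_change(releases)
# Same raw incident count, but the smaller release looks worse once
# normalised: 0.2 vs 0.8 incidents per kLOC.
```

Which denominator to pick is exactly the open question on the slide: LOC captures code churn, tasks capture the number of shipped changes.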
48. There is more to measuring quality
• Customer value for new features: analytics, feature specific
• Usability
• Performance
• Availability
• Security
50. Metrics’ principles
• Start with questions you need answers for
• Collecting metrics is not the goal
• Buy-in for goals and metrics
• Metrics will not be perfect
• Drive the decisions
• Do not be afraid to throw them away
51. Measurement system
• You need facts
• Not easy to build, but this is not an excuse
• Use your opportunities
• Automate
52. Learning from defects
• Focus on follow-up items, not descriptions
• Do not fix the root cause only; try to prevent classes of issues
• Stating "we need to be better at … next time" will not work
53. QA Role
• Yes, it belongs to you!
• Drive
• Do not be afraid to question existing metrics
55. Images - credits
• Żuraw in Gdańsk - by JM_GD - CC BY 2.0
• Chess - by DGlodowska - CC0 Public Domain
• Land Rover Wolf XD - by Bob Bob - CC BY 2.0
• Cress keyboard - by wetwebwork - CC BY-SA 2.0
• definition - by PDPics - CC0 Public Domain
• ‘Quality ... is like buying oats. signage’ - by antefixus U.E. - CC BY-NC-ND 2.0
• ‘Customers Needed NO Experience Required’ - by Matthew Burpee - CC BY-NC-SA 2.0
• Eintracht hooligans - by Heptagon - CC BY-SA 3.0
• ‘iceland sources’ - by Barthwo - CC0 Public Domain
• Countryliner - by Arriva436 - CC BY-SA 3.0
• ‘McRae Fire. Low-Severity Fire.’ - by Kaibab National Forest - CC BY-SA 2.0
• The slug in the water - by Daniel Mietchen - CC BY-SA 2.0
• TasmanianDevil 1888 - by Mike Switzerland - CC BY-SA 3.0
• ‘Typhoid Dragon, Slain by Prevention’ - by VCU Tompkins-McCaw Library Special Collections - CC BY-NC 2.0
• Raise_your_hand_if_you_can’t_swim - by National Photo Company - CC0 1.0
• ‘Signs of the Times: If Anything is Not to your Satisfaction.... (pingnews)’ - by pingnews.com - CC BY-SA 2.0
• Crowd - by James Cridland - CC BY 2.0
• whole world in my hands - by sewingstars - CC BY-NC-ND 3.0
• ShippingContainerSFBay - by Mgunn - CC0 Public Domain
• Anger Controlls Him - by Jessica Flavin - CC BY 2.0
• houses - by OpenClipartVectors - CC0 Public Domain
• Under Floor Cable Runs Ell - by Robert.Harker - CC BY-SA 3.0
• Angry... ? - by Navaneeth KN - CC BY 2.0