13 Easy split testing f***ups

@OptimiseOrDie
Top Fuckups for 2013
1. Testing in the wrong place
2. Your hypothesis inputs are crap
3. No analytics integration
4. Your test will finish after you die
5. Not testing for long enough
6. No QA for your split test
7. Opportunities are not prioritised
8. Testing cycles are too slow
9. Your test fails
10. The result is ‘about the same’
11. Test flips or moves around
12. Nobody ‘feels’ the test
13. You forgot you were responsive
@OptimiseOrDie
@OptimiseOrDie
• UX and Analytics (1999)
• User Centred Design (2001)
• Agile, Startups, No budget (2003)
• Funnel optimisation (2004)
• Multivariate & A/B (2005)
• Conversion Optimisation (2005)
• Persuasive Copywriting (2006)
• Joined Twitter (2007)
• Lean UX (2008)
• Holistic Optimisation (2009)

Was : Group eBusiness Manager, Belron
Now : Consulting
Timeline : -1998 | 1999-2004 | 2004-2008 | 2008-2012
@OptimiseOrDie
#1 : You’re doing it in the wrong place

@OptimiseOrDie
#1 : You’re doing it in the wrong place
There are 4 areas a CRO expert always looks at:
1. Inbound attrition (medium, source, landing page, keyword,
intent and many more…)
2. Key conversion points (product, basket, registration)
3. Processes and steps (forms, logins, registration, checkout)
4. Layers of engagement (search, category, product, add)
1. Use visitor flow reports for attrition – very useful.
2. For key conversion points, look at loss rates & interactions.
3. Processes and steps – look at funnels or make your own.
4. Layers of engagement – make a model.

Let’s look at an example I’ve used recently
@OptimiseOrDie
Examples – Concept
[Diagram: visitors either Bounce, or Engage and reach an Outcome]
@OptimiseOrDie
Examples – Shoprush.com
[Funnel: Search or Category → Product Page → Add to basket → View basket → Checkout → Complete, with Bounce exits]
@OptimiseOrDie
Examples – 16-25Railcard.co.uk
[Funnel: Content Engage → Login to Account → Start Application → Type and Details → Eligibility → Photo → Complete, with Bounce exits]
@OptimiseOrDie
Examples – Guide Dogs
[Funnel: Content Engage → Donation Pathway → Donation Page → Starts process → Funnel steps → Complete, with Bounce exits]
@OptimiseOrDie
Within a layer
[Diagram: Pages 1–5 within a layer, with paths to Exit, to a Deeper Layer, and to micro conversions (Wishlist, Contact, Email, Like)]
@OptimiseOrDie
#1 : You’re doing it in the wrong place
• Get to know the flow and loss (leaks) inbound, inside and
through key processes or conversion points.
• Once you know the key steps you’re losing people at and
how much traffic you have – make a money model.
• Let’s say 1,000 people see the page a month. Of those,
20% (200) convert to checkout.
• Estimate the influence your test can bring. How much
money or KPI improvement would a 10% lift in the
checkouts deliver?
• Congratulations – you’ve now built the world’s first IT
plan with a return on investment estimate attached!
• I’ll talk more about prioritising later – but a good real
world analogy for you to use:
@OptimiseOrDie
Think like a
store owner!
If you can’t refurbish
the entire store,
which floors or
departments will you
invest in optimising?
Wherever there is:
• Footfall
• Low return
• Opportunity
@OptimiseOrDie
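The money model a few slides up can be sketched in a few lines of code. The visitor and conversion figures are the slide's own example; the £50 average order value is an illustrative assumption:

```python
# Sketch of the money model above. The average order value (£50)
# and the 10% lift are illustrative assumptions.

def monthly_value_of_lift(visitors, conversion_rate, avg_order_value, lift):
    """Extra monthly revenue a relative lift in conversions would deliver."""
    baseline_orders = visitors * conversion_rate   # 1,000 * 20% = 200
    extra_orders = baseline_orders * lift          # 200 * 10% = 20
    return extra_orders * avg_order_value          # 20 * £50 = £1,000

extra = monthly_value_of_lift(visitors=1000, conversion_rate=0.20,
                              avg_order_value=50.0, lift=0.10)
print(f"Estimated extra revenue: £{extra:,.0f}/month")
```

Attach that figure to the dev estimate and you have the ROI-backed plan the slide describes.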
#2 : Your hypothesis inputs are all wrong
Insight inputs (the wrong kind): Opinion, Cherished notions, Marketing whims, Cosmic rays, Not ‘on brand’ enough, Ego, IT inflexibility, Panic, Internal company needs, Competitor change, An article the CEO read, Some dumbass consultant, Competitor copying, Dice rolling, Guessing, Knee-jerk reactions, Shiny feature blindness. #FAIL
@OptimiseOrDie
#2 : These are the inputs you need…
Insight inputs: Usability testing, Forms analytics, Search analytics, Voice of Customer, Market research, Eye tracking, Customer contact, A/B and MVT testing, Big & unstructured data, Social analytics, Session replay, Web analytics, Segmentation, Sales and Call Centre, Surveys, Customer services, Competitor evals.
@OptimiseOrDie
#2 : Solutions
• You need multiple tool inputs
– Tool decks are here : www.slideshare.net/sullivac

• Usability testing and User facing teams
– If you’re not using these properly, you’re hosed

• Session replay tools provide vital input
– Get vital additional customer evidence

• Simple page Analytics don’t cut it
– Invest in your analytics, especially event tracking

• Ego, Opinion, Cherished notions – fill gaps
– Fill these vacuums with insights and data

• Champion the user
– Give them a chair at every meeting
@OptimiseOrDie
#3 : No analytics integration
• Investigating problems with tests
• Segmentation of results
• Tests that fail, flip or move around
• Tests that don’t make sense
• Broken test setups
• What drives the averages you see?
@OptimiseOrDie
[Cartoon: “These Danish porn sites are so hardcore!” / “We still keep watching our old AB tests in retirement”]
#4 : The test will finish after you die
• Use a test length calculator like this one:
• visualwebsiteoptimizer.com/ab-split-test-duration/
#5 : You don’t test for long enough
• The minimum length
– 2 business cycles (comparison)
– Always test ‘whole’ not partial cycles
– Don’t self stop!
– Usually a week, 2 weeks, or a month
– Be aware of multiple cycles

• How long after that
– 95% confidence or higher is my aim – and I often hit higher than this
– I aim for a minimum of 250 outcomes, ideally 350+, for each ‘creative’
– If you test 4 recipes, that’s 1,400 outcomes needed
– You should have worked out how long each batch of 350 needs before you start!
– If you segment, you’ll need more data
– It may need a bigger sample if the response rates are similar*
– Use a test length calculator but be aware of minimums
– Important insider tip – watch the error bars! The +/- stuff – let’s explain

* Stats geeks will know I’m glossing over something here. Test time depends on how the two experiments separate in terms of relative performance, as well as how volatile the test response is. I’ll talk about this when I record this one! This is why testing similar stuff sux.
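As a rough sanity check on what a test length calculator is doing, here is the standard normal-approximation sample-size arithmetic for comparing two proportions. This is a simplified sketch, not the calculator linked above: the baseline rate, target lift and traffic figures are invented, and real planning should use a proper calculator.

```python
import math

def sample_size_per_variant(p, mde):
    """Rough per-variant sample size to detect a relative lift (mde)
    over a baseline conversion rate p.
    Two-sided alpha = 0.05 (z = 1.96), power = 0.8 (z = 0.84).
    Simplified normal approximation -- illustration only."""
    z_a, z_b = 1.96, 0.84
    p2 = p * (1 + mde)                 # variant rate if the lift is real
    pooled = (p + p2) / 2
    n = ((z_a * math.sqrt(2 * pooled * (1 - pooled))
          + z_b * math.sqrt(p * (1 - p) + p2 * (1 - p2))) ** 2) / (p2 - p) ** 2
    return math.ceil(n)

def weeks_to_run(n_per_variant, variants, weekly_traffic):
    """Whole weeks needed -- round up, and test whole business cycles."""
    return math.ceil(n_per_variant * variants / weekly_traffic)

# 5% baseline, hoping to detect a 20% relative lift, 4 recipes,
# 5,000 visitors/week to the test (all invented numbers):
n = sample_size_per_variant(p=0.05, mde=0.20)
print(n, "per variant =", weeks_to_run(n, 4, 5000), "weeks minimum")
```

Note how fast the required sample grows as the detectable lift shrinks – this is why testing timid, similar variations takes so long.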
The tennis court
– Let’s say we want to estimate, on average, what height Roger Federer
and Nadal hit the ball over the net at. So, let’s start the match:

@OptimiseOrDie
First Set Federer 6-4
– We start to collect values

63.5cm
+/- 2cm

62cm
+/- 2cm

@OptimiseOrDie
Second Set – Nadal 7-6
– Nadal starts sending them low over the net

62.5cm
+/- 1cm

62cm
+/- 1cm

@OptimiseOrDie
Final Set Nadal 7-6
– We start to collect values

62cm
+/- .3cm

61.8cm
+/- .3cm
Let’s look at this a different way
[Chart: a conversion rate of 9.1 ± 0.3%, shown as a band – just like 62.5cm ± 1cm]
@OptimiseOrDie
Graph is a range, not a line: 9.1 ± 0.3%
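The error bars in the tennis analogy are confidence intervals, and "separation" simply means the two intervals no longer overlap. A minimal sketch, with invented visitor and conversion counts:

```python
import math

def conversion_ci(conversions, visitors, z=1.96):
    """95% normal-approximation confidence interval for a conversion
    rate -- the '+/-' error bar your test tool shows."""
    p = conversions / visitors
    half_width = z * math.sqrt(p * (1 - p) / visitors)
    return (p - half_width, p + half_width)

def intervals_separate(a, b):
    """True when two (low, high) intervals don't overlap --
    the 'error bar separation' to look for between creatives."""
    return a[1] < b[0] or b[1] < a[0]

control = conversion_ci(250, 3000)   # ~8.3% +/- ~1.0%
variant = conversion_ci(350, 3000)   # ~11.7% +/- ~1.1%
print("Separated:", intervals_separate(control, variant))
```

More data shrinks the half-width (it falls with the square root of the sample), which is exactly why the tennis estimates tightened set by set.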
#5 : Summary
• The minimum length:
– 2 business cycles minimum, regardless of outcomes
– 250+, prefer 350+, outcomes in each
– 95%+ confidence
– Error bar separation between creatives

• Pay attention to:
– Time it will take for the number of ‘recipes’ in the test
– The actual footfall to the test – not sitewide numbers
– Test results that don’t separate – this makes the test longer
– This is why you need brave tests – to drive difference
– The error bars – the numbers in your AB testing tool are not precise; they’re fuzzy regions that depend on response and sample size
– Sudden changes in test performance or response
– Monitor early tests like a chef!
@OptimiseOrDie
#6 : No QA testing for the AB test?
www.crossbrowsertesting.com
www.browserstack.com
www.spoon.net
www.cloudtesting.com
www.multibrowserviewer.com
www.saucelabs.com
#6 : What QA testing should I do?
• Cross Browser Testing
• Testing from several locations (office, home, elsewhere)
• Testing the IP filtering is set up
• Test tags are firing correctly (analytics and the test tool)
• Test as a repeat visitor and check session timeouts
• Cross check figures from 2+ sources
• Monitor closely from launch, recheck, watch

@OptimiseOrDie
#7 : Opportunities are not prioritised
Once you have a list of potential test areas, rank them by opportunity vs. effort. The common ranking metrics I use include:
• Opportunity (profit, revenue)
• Dev resource
• Time to market
• Risk / Complexity
Make yourself a quadrant diagram and plot them.
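One way to turn those ranking metrics into an ordered list is a simple value-per-effort score – the quadrant plot in list form. The opportunities and scores below are entirely made up:

```python
# Hypothetical backlog: (name, estimated monthly value, effort in dev-days).
# All figures are illustrative.
opportunities = [
    ("Checkout button copy",   9000,  2),
    ("Full basket redesign",  15000, 30),
    ("Landing page headline",  4000,  1),
]

# Rank by value per dev-day: high value / low effort rises to the top.
ranked = sorted(opportunities, key=lambda o: o[1] / o[2], reverse=True)
for name, value, effort in ranked:
    print(f"{name}: {value}/month for {effort} dev-days")
```

Note how the biggest absolute opportunity (the redesign) sinks to the bottom once effort is counted – exactly the trade-off the quadrant diagram makes visible.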
#8 : Your cycles are too slow
[Chart: Conversion plotted over 0, 6, 12, 18 months]
@OptimiseOrDie
#8 : Solutions
• Give Priority Boarding for opportunities
– The best seats reserved for metric shifters

• Release more often to close the gap
– More testing resource helps, analytics ‘hawk eye’

• Kaizen – continuous improvement
– Others call it JFDI (just f***ing do it)

• Make changes AS WELL as tests, basically!
– These small things add up

• RUSH Hair booking – Over 100 changes
– No functional changes at all – 37% improvement

• In between product lifecycles?
– The added lift for 10 days’ work, worth 360k
@OptimiseOrDie
#8 : Make your own cycles

@OptimiseOrDie
#9 : Your test fails


@OptimiseOrDie
#9 : Your test fails
• Learn from the failure! If you can’t learn from the failure, you’ve
designed a crap test.
• Next time you design, imagine all your stuff failing. What would
you do? If you don’t know or you’re not sure, get it changed so
that a negative becomes insightful.
• So: failure itself at a creative or variable level should tell you
something.
• On a failed test, always analyse the segmentation and analytics
• One or more segments will be over and under
• Check for varied performance
• Now add the failure info to your Knowledge Base:
• Look at it carefully – what does the failure tell you? Which
element do you think drove the failure?
• If you know what failed (e.g. making the price bigger) then you
have very useful information
• You turned the handle the wrong way
• Now brainstorm a new test
@OptimiseOrDie
#10 : The test is ‘about the same’
• Analyse the segmentation
• Check the analytics and instrumentation
• One or more segments may be over and under
• They may be cancelling out – the average is a lie
• The segment level performance will help you (beware of
small sample sizes)
• If you genuinely have a test which failed to move any
segments, it’s a crap test – be bolder
• This usually happens when it isn’t bold or brave enough in
shifting away from the original design, particularly on
lower traffic sites
• Get testing again!
@OptimiseOrDie
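The "average is a lie" point is easy to demonstrate with made-up numbers: two segments moving in opposite directions can produce a perfectly flat overall result.

```python
# Illustration with invented figures: the variant wins on mobile,
# loses on desktop, and the blended average shows nothing at all.
segments = {
    # segment: (visitors per arm, control conversions, variant conversions)
    "mobile":  (2000, 100, 140),   # 5.0% -> 7.0%: variant up
    "desktop": (2000, 200, 160),   # 10.0% -> 8.0%: variant down
}

total_visitors = sum(v for v, _, _ in segments.values())
control_rate = sum(c for _, c, _ in segments.values()) / total_visitors
variant_rate = sum(t for _, _, t in segments.values()) / total_visitors

print(f"Overall: control {control_rate:.1%} vs variant {variant_rate:.1%}")
for name, (v, c, t) in segments.items():
    print(f"  {name}: control {c/v:.1%} vs variant {t/v:.1%}")
```

Both arms land on the same 7.5% overall, so the tool reports "about the same" – yet each segment has a real, opposite story.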
#11 : The test keeps moving around
• There are three reasons it is moving around:
– Your sample size (outcomes) is still too small
– The external traffic mix, customers or reaction has suddenly changed
– Your inbound marketing driven traffic mix is completely volatile (very rare)
• Check the sample size
• Check all your marketing activity
• Check the instrumentation
• If no reason, check segmentation

@OptimiseOrDie
#11 : The test has flipped on me

• Something like this can happen:
• Check your sample size. If it’s still small, then expect this until the test settles.
• If the test does genuinely flip – and quite severely – then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously!
• To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system.
• The segmented data will help you to identify the source of the shift in response to your test. I rarely get a flipped one and it’s always something changing on me, without being told. The heartless bastards.
#12 : Nobody feels the test
• You promised a 25% rise in checkouts – you only see 2%
• Traffic, Advertising, Marketing may have changed
• Check they’re using the same precise metrics
• Run a calibration exercise
• I often leave a 5 or 10% stub running in a test
• This tracks the old creative once the new one goes live
• If conversion is also down for that one, BINGO!
• Remember – the AB test is an estimate – it doesn’t precisely record future performance
• This is why infrequent testing is bad
• Always be trying a new test instead of basking in the glory of one you ran 6 months ago. You’re only as good as your next test.
@OptimiseOrDie
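The 5–10% stub is just a weighted traffic split that keeps the old creative alive for calibration. A minimal sketch of one way to do it – the hashing scheme, seed and 10% share are illustrative, not how any particular tool assigns traffic:

```python
import random

def assign(visitor_id, stub_share=0.10, seed=42):
    """Deterministically assign a visitor to the 'old' (stub) creative
    or the 'new' winner. Same visitor, same bucket, every visit."""
    rng = random.Random(f"{seed}:{visitor_id}")  # per-visitor seeded RNG
    return "old" if rng.random() < stub_share else "new"

# Simulate 10,000 visitors to check the split lands near 90/10.
counts = {"old": 0, "new": 0}
for visitor in range(10_000):
    counts[assign(visitor)] += 1
print(counts)
```

Comparing the stub's conversion rate against the winner's over time is the calibration check: if both drift down together, the change is in your traffic, not your creative.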
#13 : You forgot you were responsive
• If you’re AB testing a responsive site, pay attention
• Content will break differently on many screens
• Know thy users and their devices
• Use Bango or Google Analytics to define a test list
• Make sure you test mobile devices & viewports
• What looks good on your desk may not be for the user
• It’s harder to design cross device tests
• You’ll need to segment mobile, tablet & desktop response in the analytics or AB testing package
• Your personal phone is not a device mix

@OptimiseOrDie
Top Fuckups for 2013
1. Testing in the wrong place
2. Your hypothesis inputs are crap
3. No analytics integration
4. Your test will finish after you die
5. Not testing for long enough
6. No QA for your split test
7. Opportunities are not prioritised
8. Testing cycles are too slow
9. Your test fails
10. The result is ‘about the same’
11. Test flips or moves around
12. Nobody ‘feels’ the test
13. You forgot you were responsive
@OptimiseOrDie
BONUS : What is a good conversion rate?

Higher than the one you had last month!
Is there a way to fix this then? Conversion Heroes!

@OptimiseOrDie
Slides at www.Slideshare.net/sullivac

Email : sullivac@gmail.com
Twitter : @OptimiseOrDie
LinkedIn : linkd.in/pvrg14

Contenu connexe

Plus de Craig Sullivan

Cross Device Optimisation - Google Analytics Shortcuts
Cross Device Optimisation - Google Analytics ShortcutsCross Device Optimisation - Google Analytics Shortcuts
Cross Device Optimisation - Google Analytics ShortcutsCraig Sullivan
 
Product Design is Poo - And we're all going to die
Product Design is Poo - And we're all going to dieProduct Design is Poo - And we're all going to die
Product Design is Poo - And we're all going to dieCraig Sullivan
 
Product design is Poo - And how to fix it!
Product design is Poo - And how to fix it!Product design is Poo - And how to fix it!
Product design is Poo - And how to fix it!Craig Sullivan
 
Conversion Research in One Hour
Conversion Research in One HourConversion Research in One Hour
Conversion Research in One HourCraig Sullivan
 
Web Analytics Wednesday - Session Replay Tools are Vital
Web Analytics Wednesday - Session Replay Tools are VitalWeb Analytics Wednesday - Session Replay Tools are Vital
Web Analytics Wednesday - Session Replay Tools are VitalCraig Sullivan
 
Interact London - 21 Oct 2015 - Scaling Stupidity
Interact London - 21 Oct 2015 - Scaling StupidityInteract London - 21 Oct 2015 - Scaling Stupidity
Interact London - 21 Oct 2015 - Scaling StupidityCraig Sullivan
 
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015Craig Sullivan
 
Surviving the hype cycle Shortcuts to split testing success
Surviving the hype cycle   Shortcuts to split testing successSurviving the hype cycle   Shortcuts to split testing success
Surviving the hype cycle Shortcuts to split testing successCraig Sullivan
 
Myths and Illusions of Cross Device Testing - Elite Camp June 2015
Myths and Illusions of Cross Device Testing - Elite Camp June 2015Myths and Illusions of Cross Device Testing - Elite Camp June 2015
Myths and Illusions of Cross Device Testing - Elite Camp June 2015Craig Sullivan
 
Myths, Lies and Illusions of AB and Split Testing
Myths, Lies and Illusions of AB and Split TestingMyths, Lies and Illusions of AB and Split Testing
Myths, Lies and Illusions of AB and Split TestingCraig Sullivan
 
20 Ways to Shaft your Split Tesring : Conversion Conference
20 Ways to Shaft your Split Tesring : Conversion Conference20 Ways to Shaft your Split Tesring : Conversion Conference
20 Ways to Shaft your Split Tesring : Conversion ConferenceCraig Sullivan
 
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to MeBrighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to MeCraig Sullivan
 
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to me
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to meDigital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to me
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to meCraig Sullivan
 
20 top AB testing mistakes and how to avoid them
20 top AB testing mistakes and how to avoid them20 top AB testing mistakes and how to avoid them
20 top AB testing mistakes and how to avoid themCraig Sullivan
 
#Measurefest : 20 Simple Ways to Fuck Up your AB tests
#Measurefest : 20 Simple Ways to Fuck Up your AB tests#Measurefest : 20 Simple Ways to Fuck Up your AB tests
#Measurefest : 20 Simple Ways to Fuck Up your AB testsCraig Sullivan
 
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing#Measurecamp : 18 Simple Ways to F*** up Your AB Testing
#Measurecamp : 18 Simple Ways to F*** up Your AB TestingCraig Sullivan
 
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CRO
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CROUXPA UK - Toolkits and Tips for Blending UX, Analytics and CRO
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CROCraig Sullivan
 
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014Craig Sullivan
 
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...12 Things to do Before Your Company Dies : Conversion Conference London - Oct...
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...Craig Sullivan
 
Why does my Mobile Conversion rate suck? 19 Sep 2013 @ Conversion Thursday #...
Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...
Why does my Mobile Conversion rate suck? 19 Sep 2013 @ Conversion Thursday #...Craig Sullivan
 

Plus de Craig Sullivan (20)

Cross Device Optimisation - Google Analytics Shortcuts
Cross Device Optimisation - Google Analytics ShortcutsCross Device Optimisation - Google Analytics Shortcuts
Cross Device Optimisation - Google Analytics Shortcuts
 
Product Design is Poo - And we're all going to die
Product Design is Poo - And we're all going to dieProduct Design is Poo - And we're all going to die
Product Design is Poo - And we're all going to die
 
Product design is Poo - And how to fix it!
Product design is Poo - And how to fix it!Product design is Poo - And how to fix it!
Product design is Poo - And how to fix it!
 
Conversion Research in One Hour
Conversion Research in One HourConversion Research in One Hour
Conversion Research in One Hour
 
Web Analytics Wednesday - Session Replay Tools are Vital
Web Analytics Wednesday - Session Replay Tools are VitalWeb Analytics Wednesday - Session Replay Tools are Vital
Web Analytics Wednesday - Session Replay Tools are Vital
 
Interact London - 21 Oct 2015 - Scaling Stupidity
Interact London - 21 Oct 2015 - Scaling StupidityInteract London - 21 Oct 2015 - Scaling Stupidity
Interact London - 21 Oct 2015 - Scaling Stupidity
 
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015
Surviving the AB Testing Hype Cycle - Reaktor Breakpoint 2015
 
Surviving the hype cycle Shortcuts to split testing success
Surviving the hype cycle   Shortcuts to split testing successSurviving the hype cycle   Shortcuts to split testing success
Surviving the hype cycle Shortcuts to split testing success
 
Myths and Illusions of Cross Device Testing - Elite Camp June 2015
Myths and Illusions of Cross Device Testing - Elite Camp June 2015Myths and Illusions of Cross Device Testing - Elite Camp June 2015
Myths and Illusions of Cross Device Testing - Elite Camp June 2015
 
Myths, Lies and Illusions of AB and Split Testing
Myths, Lies and Illusions of AB and Split TestingMyths, Lies and Illusions of AB and Split Testing
Myths, Lies and Illusions of AB and Split Testing
 
20 Ways to Shaft your Split Tesring : Conversion Conference
20 Ways to Shaft your Split Tesring : Conversion Conference20 Ways to Shaft your Split Tesring : Conversion Conference
20 Ways to Shaft your Split Tesring : Conversion Conference
 
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to MeBrighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me
Brighton CRO Meetup #1 - Oh Boy These AB tests Sure Look Like Bullshit to Me
 
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to me
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to meDigital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to me
Digital Impact 2014 - Oh Boy These AB tests sure look like Bullshit to me
 
20 top AB testing mistakes and how to avoid them
20 top AB testing mistakes and how to avoid them20 top AB testing mistakes and how to avoid them
20 top AB testing mistakes and how to avoid them
 
#Measurefest : 20 Simple Ways to Fuck Up your AB tests
#Measurefest : 20 Simple Ways to Fuck Up your AB tests#Measurefest : 20 Simple Ways to Fuck Up your AB tests
#Measurefest : 20 Simple Ways to Fuck Up your AB tests
 
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing#Measurecamp : 18 Simple Ways to F*** up Your AB Testing
#Measurecamp : 18 Simple Ways to F*** up Your AB Testing
 
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CRO
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CROUXPA UK - Toolkits and Tips for Blending UX, Analytics and CRO
UXPA UK - Toolkits and Tips for Blending UX, Analytics and CRO
 
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014
The Neuromarketing Toolkit - Chinwag Psych - 4 Feb 2014
 
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...12 Things to do Before Your Company Dies : Conversion Conference London - Oct...
12 Things to do Before Your Company Dies : Conversion Conference London - Oct...
 
Why does my Mobile Conversion rate suck? 19 Sep 2013 @ Conversion Thursday #...
Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...Why does my Mobile Conversion rate suck?  19 Sep 2013 @ Conversion Thursday #...
Why does my Mobile Conversion rate suck? 19 Sep 2013 @ Conversion Thursday #...
 

Dernier

DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsRizwan Syed
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Manik S Magar
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DayH2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DaySri Ambati
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024Stephanie Beckett
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 

Dernier (20)

DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!Anypoint Exchange: It’s Not Just a Repo!
Anypoint Exchange: It’s Not Just a Repo!
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo DayH2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
H2O.ai CEO/Founder: Sri Ambati Keynote at Wells Fargo Day
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 

13 Easy AB and Split Test Screwups - Conversionista Meetup - Stockholm

  • 1. 13 Easy split testing f***ups 1 @OptimiseOrDie
  • 2. Top Fuckups for 2013 1. Testing in the wrong place 2. Your hypothesis inputs are crap 3. No analytics integration 4. Your test will finish after you die 5. Not testing for long enough 6. No QA for your split test 7. Opportunities are not prioritised 8. Testing cycles are too slow 9. Your test fails 10. The result is ‘about the same’ 11. Test flips or moves around 12. Nobody ‘feels’ the test 13. You forgot you were responsive @OptimiseOrDie
  • 3. @OptimiseOrDie • UX and Analytics (1999) • User Centred Design (2001) • Agile, Startups, No budget (2003) • Funnel optimisation (2004) • Multivariate & A/B (2005) • Conversion Optimisation (2005) • Persuasive Copywriting (2006) • Joined Twitter (2007) • Lean UX (2008) • Holistic Optimisation (2009) Was : Group eBusiness Manager, Belron Now : Consulting
  • 4. Timeline - 1998 1999 - 2004 2004-2008 2008-2012 @OptimiseOrDie
  • 5.
  • 6. #1 : You’re doing it in the wrong place @OptimiseOrDie
  • 7. #1 : You’re doing it in the wrong place There are 4 areas a CRO expert always looks at: 1. Inbound attrition (medium, source, landing page, keyword, intent and many more…) 2. Key conversion points (product, basket, registration) 3. Processes and steps (forms, logins, registration, checkout) 4. Layers of engagement (search, category, product, add) 1. Use visitor flow reports for attrition – very useful. 2. For key conversion points, look at loss rates & interactions 3. Processes and steps – look at funnels or make your own 4. Layers and engagement – make a model Let’s look at an example I’ve used recently @OptimiseOrDie
  • 9. Examples – Shoprush.com Search or Category Product Page Add to basket View basket Checkout Bounce Complete @OptimiseOrDie
  • 10. Examples – 16-25Railcard.co.uk Login to Account Content Engage Start Application Type and Details Eligibility Bounce Photo Complete @OptimiseOrDie
  • 11. 6.3 – Examples – Guide Dogs Content Engage Donation Pathway Donation Page Starts process Funnel steps Bounce Complete @OptimiseOrDie
  • 12. 6.3 – Within a layer Exit Page 1 Page 3 Page 2 Page 4 Wishlist Contact Page 5 Email Like Deeper Layer @OptimiseOrDie Micro Conversions
  • 13. #1 : You’re doing it in the wrong place • Get to know the flow and loss (leaks) inbound, inside and through key processes or conversion points. • Once you know the key steps you’re losing people at and how much traffic you have – make a money model. • Let’s say 1,000 people see the page a month. Of those, 20% (200) convert to checkout. • Estimate the influence your test can bring. How much money or KPI improvement would a 10% lift in the checkouts deliver? • Congratulations – you’ve now built the world’s first IT plan with a return on investment estimate attached! • I’ll talk more about prioritising later – but a good real world analogy for you to use: @OptimiseOrDie
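The money model described on the slide above can be sketched in a few lines. This is a rough illustration, not real client data: the £50 average order value is an invented input.

```python
# Sketch of the "money model" from the slide. All inputs are hypothetical;
# plug in your own footfall, conversion rate and order value.
def monthly_value_of_lift(visitors, conv_rate, aov, relative_lift):
    """Estimate extra monthly revenue from a relative lift in conversion."""
    baseline_orders = visitors * conv_rate      # people converting today
    extra_orders = baseline_orders * relative_lift
    return extra_orders * aov

# Slide example: 1,000 visitors/month, 20% convert to checkout,
# an assumed £50 average order, and a hoped-for 10% relative lift.
extra = monthly_value_of_lift(visitors=1000, conv_rate=0.20,
                              aov=50, relative_lift=0.10)
print(f"Extra revenue per month: £{extra:.0f}")  # → £1000
```

That per-month figure, multiplied out over a year, is the "return on investment estimate" to attach to the plan.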
  • 14. Think like a store owner! If you can’t refurbish the entire store, which floors or departments will you invest in optimising? Wherever there is: • Footfall • Low return • Opportunity @OptimiseOrDie
  • 15. #2 : Your hypothesis inputs are all wrong Insight - Inputs Opinion Cherished notions Marketing whims Cosmic rays Not ‘on brand’ enough Ego IT inflexibility Panic Internal company needs #FAIL Competitor change An article the CEO read Some dumbass consultant Competitor copying Dice rolling Guessing Knee jerk reactions Shiny feature blindness @OptimiseOrDie
  • 16. #2 : These are the inputs you need… Insight - Inputs Usability testing Forms analytics Search analytics Voice of Customer Market research Eye tracking Customer contact A/B and MVT testing Big & unstructured data Insight Social analytics Session Replay Web analytics Segmentation Sales and Call Centre Surveys Customer services Competitor evals @OptimiseOrDie
  • 17. #2 : Solutions • You need multiple tool inputs – Tool decks are here : www.slideshare.net/sullivac • Usability testing and User facing teams – If you’re not using these properly, you’re hosed • Session replay tools provide vital input – Get vital additional customer evidence • Simple page Analytics don’t cut it – Invest in your analytics, especially event tracking • Ego, Opinion, Cherished notions – fill gaps – Fill these vacuums with insights and data • Champion the user – Give them a chair at every meeting @OptimiseOrDie
  • 18. #3 : No analytics integration • Investigating problems with tests • Segmentation of results • Tests that fail, flip or move around • Tests that don’t make sense • Broken test setups • What drives the averages you see? @OptimiseOrDie
  • 19. #4 : The test will finish after you die [cartoon: retirees joking that their ancient AB tests are still running in retirement] • Use a test length calculator like this one: • visualwebsiteoptimizer.com/ab-split-test-duration/
  • 20. #5 : You don’t test for long enough • The minimum length – 2 business cycles (comparison) – Always test ‘whole’ not partial cycles – Don’t self stop! – Usually a week, 2 weeks, Month – Be aware of multiple cycles • How long after that – 95% confidence or higher is my aim – and often hit higher than this – I aim for a minimum 250 outcomes, ideally 350+ for each ‘creative’ – If you test 4 recipes, that’s 1400 outcomes needed – You should have worked out how long each batch of 350 needs before you start! – If you segment, you’ll need more data – It may need a bigger sample if the response rates are similar* – Use a test length calculator but be aware of minimums – Important insider tip – watch the error bars! The +/- stuff – let’s explain * Stats geeks know I’m glossing over something here. That test time depends on how the two experiments separate in terms of relative performance as well as how volatile the test response is. I’ll talk about this when I record this one! This is why testing similar stuff sux.
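The calculators mentioned above mostly implement the standard two-proportion power formula. A stdlib-only sketch of that calculation (normal approximation; the 3% baseline and 20% lift below are example inputs, not the deck's numbers):

```python
from math import sqrt
from statistics import NormalDist

def sample_size_per_variant(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant for a two-sided
    two-proportion z-test (standard power formula, normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar)) +
          z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return int(n) + 1

# A 3% baseline with a hoped-for 20% relative lift needs over ten
# thousand visitors *per variant* - hundreds of visitors won't cut it.
print(sample_size_per_variant(0.03, 0.20))
```

Note how the required sample shrinks as the expected lift grows: another reason brave, very different variants finish faster than timid ones.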
  • 21. #5 : The tennis court – Let’s say we want to estimate, on average, what height Roger Federer and Nadal hit the ball over the net at. So, let’s start the match: @OptimiseOrDie
  • 22. First Set Federer 6-4 – We start to collect values 63.5cm +/- 2cm 62cm +/- 2cm @OptimiseOrDie
  • 23. Second Set – Nadal 7-6 – Nadal starts sending them low over the net 62.5cm +/- 1cm 62cm +/- 1cm @OptimiseOrDie
  • 24. Final Set Nadal 7-6 – We start to collect values 62cm +/- .3cm 61.8cm +/- .3cm
  • 25. Let’s look at this a different way 9.1 ± 0.3% 62.5cm +/- 1cm @OptimiseOrDie
  • 26. Graph is a range, not a line: 9.1 ± 0.3%
  • 27. #5 : Summary • The minimum length: – 2 business cycles minimum, regardless of outcomes – 250+, prefer 350+ outcomes in each – 95%+ confidence – Error bar separation between creatives • Pay attention to: – Time it will take for the number of ‘recipes’ in the test – The actual footfall to the test – not sitewide numbers – Test results that don’t separate – makes the test longer – This is why you need brave tests – to drive difference – The error bars – the numbers in your AB testing tool are not precise – they’re fuzzy regions that depend on response and sample size. – Sudden changes in test performance or response – Monitor early tests like a chef! @OptimiseOrDie
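The "fuzzy regions" point can be made concrete by computing the interval around a conversion rate. A normal-approximation sketch (the Wilson interval behaves better at small samples, but this shows the idea; the 2,500-visitor figure is illustrative):

```python
from math import sqrt

def conversion_ci(conversions, visitors, z=1.96):
    """Normal-approximation 95% confidence interval for a conversion rate.
    The number your testing tool shows is the middle of this fuzzy region."""
    p = conversions / visitors
    half_width = z * sqrt(p * (1 - p) / visitors)
    return p - half_width, p + half_width

# 250 outcomes from 2,500 visitors: a "10%" conversion rate is really
# a region somewhere around 8.8%-11.2%.
lo, hi = conversion_ci(250, 2500)
print(f"{lo:.3f} - {hi:.3f}")
# Two creatives whose regions still overlap have not separated yet.
```

This is the error-bar separation the summary asks for: declare a winner when the regions stop overlapping, not when the point estimates differ.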
  • 28. #6 : No QA testing for the AB test? www.crossbrowsertesting.com www.browserstack.com www.spoon.net www.cloudtesting.com www.multibrowserviewer.com www.saucelabs.com
  • 29. #6 : What QA testing should I do? • Cross Browser Testing • Testing from several locations (office, home, elsewhere) • Testing the IP filtering is set up • Test tags are firing correctly (analytics and the test tool) • Test as a repeat visitor and check session timeouts • Cross check figures from 2+ sources • Monitor closely from launch, recheck, watch @OptimiseOrDie
  • 30. #7 : Opportunities are not prioritised Once you have a list of potential test areas, rank them by opportunity vs. effort. The common ranking metrics I use include: • Opportunity (profit, revenue) • Dev resource • Time to market • Risk / Complexity Make yourself a quadrant diagram and plot them
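As a rough sketch of that quadrant ranking (every name and score below is invented for illustration - use your own money model outputs as the opportunity scores):

```python
# Hypothetical backlog scored on the slide's two axes:
# opportunity (1-10, from the money model) vs effort (1-10,
# folding in dev resource, time to market and risk/complexity).
opportunities = [
    ("Checkout form rewrite",   9, 7),
    ("Product page CTA copy",   6, 2),
    ("Basket trust messaging",  5, 3),
    ("Full redesign",          10, 10),
]

# Rank by value density: opportunity per unit of effort. The quick,
# high-footfall wins float to the top; the glamour project sinks.
ranked = sorted(opportunities, key=lambda o: o[1] / o[2], reverse=True)
for name, opp, effort in ranked:
    print(f"{opp / effort:4.1f}  {name}")
```

The quadrant diagram is the same calculation drawn as a picture: top-left (high opportunity, low effort) gets priority boarding.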
  • 31. #8 : Your cycles are too slow [chart: conversion rate vs. time, 0–18 months] @OptimiseOrDie
  • 32. #8 : Solutions • Give Priority Boarding for opportunities – The best seats reserved for metric shifters • Release more often to close the gap – More testing resource helps, analytics ‘hawk eye’ • Kaizen – continuous improvement – Others call it JFDI (just f***ing do it) • Make changes AS WELL as tests, basically! – These small things add up • RUSH Hair booking – Over 100 changes – No functional changes at all – 37% improvement • In between product lifecycles? – The added lift for 10 days’ work, worth 360k @OptimiseOrDie
  • 33. #8 : Make your own cycles @OptimiseOrDie
  • 34. #9 : Your test fails 34 @OptimiseOrDie
  • 35. #9 : Your test fails • Learn from the failure! If you can’t learn from the failure, you’ve designed a crap test. • Next time you design, imagine all your stuff failing. What would you do? If you don’t know or you’re not sure, get it changed so that a negative becomes insightful. • So : failure itself at a creative or variable level should tell you something. • On a failed test, always analyse the segmentation and analytics • One or more segments will be over and under • Check for varied performance • Now add the failure info to your Knowledge Base: • Look at it carefully – what does the failure tell you? Which element do you think drove the failure? • If you know what failed (e.g. making the price bigger) then you have very useful information • You turned the handle the wrong way • Now brainstorm a new test @OptimiseOrDie
  • 36. #10 : The test is ‘about the same’ • Analyse the segmentation • Check the analytics and instrumentation • One or more segments may be over and under • They may be cancelling out – the average is a lie • The segment level performance will help you (beware of small sample sizes) • If you genuinely have a test which failed to move any segments, it’s a crap test – be bolder • This usually happens when it isn’t bold or brave enough in shifting away from the original design, particularly on lower traffic sites • Get testing again! @OptimiseOrDie
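A toy example (all numbers invented) of how segments can cancel out so the headline result reads "about the same" while both segments actually moved:

```python
# Invented test data: (conversions, visitors) per segment.
results = {
    #            control            variant
    "desktop": ((300, 5000),        (260, 5000)),   # variant clearly worse
    "mobile":  ((100, 5000),        (140, 5000)),   # variant clearly better
}

def rate(pair):
    conversions, visitors = pair
    return conversions / visitors

for segment, (control, variant) in results.items():
    print(f"{segment}: control {rate(control):.1%} vs variant {rate(variant):.1%}")

# The blended averages hide both movements completely.
overall_control = sum(c for (c, _), _ in results.values()) / 10000
overall_variant = sum(v for _, (v, _) in results.values()) / 10000
print(f"overall: {overall_control:.1%} vs {overall_variant:.1%}")  # both 4.0%
```

The testing tool only sees the last line; the analytics segmentation sees the first two. That is why "about the same" always warrants a segment check before you bin the test.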
  • 37. #11 : The test keeps moving around • There are three reasons it is moving around – Your sample size (outcomes) is still too small – The external traffic mix, customers or reaction has suddenly changed or – Your inbound marketing driven traffic mix is completely volatile (very rare) • Check the sample size • Check all your marketing activity • Check the instrumentation • If no reason, check segmentation @OptimiseOrDie
  • 38. #11 : The test has flipped on me • Something like this can happen: • Check your sample size. If it’s still small, then expect this until the test settles. • If the test does genuinely flip – and quite severely – then something has changed with the traffic mix, the customer base or your advertising. Maybe the PPC budget ran out? Seriously! • To analyse a flipped test, you’ll need to check your segmented data. This is why you have a split testing package AND an analytics system. The segmented data will help you to identify the source of the shift in response to your test. • I rarely get a flipped one and it’s always something changing on me, without being told. The heartless bastards.
  • 39. #12 : Nobody feels the test • You promised a 25% rise in checkouts - you only see 2% • Traffic, Advertising, Marketing may have changed • Check they’re using the same precise metrics • Run a calibration exercise • I often leave a 5 or 10% stub running in a test • This tracks old creative once new one goes live • If conversion is also down for that one, BINGO! • Remember – the AB test is an estimate – it doesn’t precisely record future performance • This is why infrequent testing is bad • Always be trying a new test instead of basking in the glory of one you ran 6 months ago. You’re only as good as your next test. @OptimiseOrDie
  • 40. #13 : You forgot you were responsive • If you’re AB testing a responsive site, pay attention • Content will break differently on many screens • Know thy users and their devices • Use Bango or Google Analytics to define a test list • Make sure you test mobile devices & viewports • What looks good on your desk may not be for the user • Harder to design cross device tests • You’ll need to segment mobile, tablet & desktop response in the analytics or AB testing package • Your personal phone is not a device mix @OptimiseOrDie
  • 41. Top Fuckups for 2013 1. Testing in the wrong place 2. Your hypothesis inputs are crap 3. No analytics integration 4. Your test will finish after you die 5. Not testing for long enough 6. No QA for your split test 7. Opportunities are not prioritised 8. Testing cycles are too slow 9. Your test fails 10. The result is ‘about the same’ 11. Test flips or moves around 12. Nobody ‘feels’ the test 13. You forgot you were responsive @OptimiseOrDie
  • 42. BONUS : What is a good conversion rate? Higher than the one you had last month! 42
  • 43. Is there a way to fix this then? Conversion Heroes! 43 @OptimiseOrDie
  • 44. Slides at www.Slideshare.net/sullivac Email : sullivac@gmail.com Twitter : @OptimiseOrDie LinkedIn : linkd.in/pvrg14 44

Editor's notes

  1. And here’s a boring slide about me – and where I’ve been driving over 400M of additional revenue in the last few years. In two months this year alone, I’ve found an additional ¾ M pounds annual profit for clients. For the sharp eyed amongst you, you’ll see that Lean UX hasn’t been around since 2008. Many startups and teams were doing this stuff before it got a new name, even if the approach was slightly different. For the last 4 years, I’ve been optimising sites using the combination of techniques I’ll show you today.
  2. Tomorrow - Go forth and kick their flabby low converting asses