User Analytics Testing - SeleniumCamp 2015

Many companies use web and user analytics, but few actually test whether the data is accurate. This presentation gives a brief introduction to analytics and presents some general ideas for how a QA team can get involved in ensuring analytics are implemented properly.

The intended audience is Quality Assurance professionals who test public-facing websites that gather information via Google Analytics, SiteCatalyst/Omniture, an internal/proprietary system, or some other means. It favors test automation, but no prior automation knowledge is required.


1. User Analytics Testing
   Marcus Merrell, RetailMeNot, Inc. (@mmerrell)

2. What We're Talking About
   - Overview of web/user analytics
   - Explanation of A/B testing
   - Why this matters to you
   - Examples
   - How we test this stuff

3. User Analytics – The Basics
   - Hits
   - Sessions
   - Users

4. User Analytics – Services

5. Web Analytics – Advanced
   - Conduct experiments
   - Tell stories from disparate points of data
   - Incremental learning
   "If you are not paying for it… you are the product being sold" --Andrew Lewis (blue_beetle)

6. A/B Testing Explained
   - Basics
   - Don't change everything at once
   - The Highball Incident
7. A Newer Example
   - Our 404 page had nothing on it
   - People landed there a lot by mistyping store names
   - Should we put coupons on it?
   - A/B test:
     - A: Control – no coupons on the 404 page
     - B: Test – coupons on the 404 page
   - Keep it simple: top 15 coupons site-wide
   - Slice in 10% of traffic (see the bucketing sketch below)
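
For illustration, here is a minimal sketch of how a deterministic 10% traffic slice can be assigned, so a returning visitor always sees the same variant. The hashing scheme, bucket count, and all names are assumptions for the example, not RetailMeNot's implementation:

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

/** Deterministic A/B bucketing sketch: the same visitor always gets the same variant. */
public final class AbBucketer {
    private static final int BUCKETS = 100;   // assumption: percent-based buckets
    private static final int TEST_SLICE = 10; // 10% of traffic gets variant B

    /** Returns "B" for the test slice, "A" (control) for everyone else. */
    public static String variantFor(String visitorId, String experimentName) {
        CRC32 crc = new CRC32();
        crc.update((experimentName + ":" + visitorId).getBytes(StandardCharsets.UTF_8));
        long bucket = crc.getValue() % BUCKETS; // stable bucket in [0, 99]
        return bucket < TEST_SLICE ? "B" : "A";
    }

    public static void main(String[] args) {
        // Same input always yields the same variant, so the experiment is reproducible
        System.out.println(variantFor("visitor-12345", "404-coupons"));
    }
}
```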
8. RetailMeNot

9. User Analytics – Telling a Story
   - OK, so people did some clicking
   - How many?
   - How many resulted in a transaction?
   - The big question: how much money do we expect to make from coupons on the 404 page?
     - Is it worth the bandwidth? The load on the servers/database?
     - Is it worth the potential future maintenance of this page?

10. Drawing a Conclusion
   - Results after 2 weeks of testing tell us that the B variation won!
   - Enough people used coupons to justify the relatively low expense
   - Ergo: continue to put coupons on the page
   - Promote the "B" test to "A"
   - New A/B test: should we indicate coupon popularity on the 404 page coupons?
     - A: Control – no popularity indication
     - B: Test – indicate coupon popularity

11. Real-World Examples
   - Shopping cart: shipping & tax calculation
   - Suggesting products and content based on cookie, not login

12. Why You Should Care
   - What if the beacons they're sending contain the wrong information?
   - But furthermore:
     - This is everywhere
     - It is only growing
     - Companies are becoming smart (really, really smart)
     - You do not want to miss this opportunity to provide value

13. Why You Should Really Care
   - As a tester:
     - There is a team of people working on this
     - It gets worked into features as they are developed
     - It is rarely called out separately as a scheduled task
     - It rarely receives QA outside of the PM and BI people who really care about it*
   *This is anecdotal, but I have yet to be told I'm wrong
14. Fortunately, It's Easy
   - Usually one extra HTTP request, made during a navigation event
   - Intercept this request, then verify the data within it (see the sketch below)
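
As one way to do that interception, here is a minimal sketch using BrowserMob Proxy with Selenium WebDriver (era-appropriate 2.x APIs): record the traffic for a navigation, find the beacon, and inspect its query parameters. The beacon path `s.gif` and the `pageName` parameter are placeholders for whatever your site actually sends:

```java
import net.lightbody.bmp.BrowserMobProxyServer;
import net.lightbody.bmp.client.ClientUtil;
import net.lightbody.bmp.core.har.HarEntry;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

public class BeaconCaptureExample {
    public static void main(String[] args) {
        // Start an embedded proxy on any free port and wire it into the browser
        BrowserMobProxyServer proxy = new BrowserMobProxyServer();
        proxy.start(0);
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability(CapabilityType.PROXY, ClientUtil.createSeleniumProxy(proxy));
        FirefoxDriver driver = new FirefoxDriver(caps);

        try {
            proxy.newHar("beacon-test");                     // begin recording traffic
            driver.get("https://www.example.com/some-page"); // the navigation under test

            // Look for the analytics beacon among the recorded requests
            for (HarEntry entry : proxy.getHar().getLog().getEntries()) {
                String url = entry.getRequest().getUrl();
                if (url.contains("s.gif")) {                 // placeholder beacon path
                    System.out.println("Beacon fired: " + url);
                    // Verify the data inside the request, e.g. a page-name parameter
                    entry.getRequest().getQueryString().stream()
                         .filter(p -> "pageName".equals(p.getName()))
                         .forEach(p -> System.out.println("pageName=" + p.getValue()));
                }
            }
        } finally {
            driver.quit();
            proxy.stop();
        }
    }
}
```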
15. Examples
   - Wells Fargo (s.gif)
   - Amazon (a.gif)
   - Netflix ([….]?trkId=xxx, beacon?s=xxx)
   - The New York Times (pixel.gif, dcs.gif)
   - OpenTable (/b/ss/otcom)
   - (and RetailMeNot – can you find it?)

16. Classic Approach
   - Marketing asks the BI team to figure out our ROI on TV ads during a period of time
   - BI asks the PM to create a series of analytics
   - The PM gives Dev the particulars
   - Dev assigns the coding task to the newest person on the team
   - If anyone tests it, it's also the newest person on the team

17. Classic Approach
   - Manual testing of web analytics is about as exciting as reconciling one large column of data against another
   - …what if it's wrong?
   - …what if it changes?
   - …why not let the software do it?

18. What We Do
19. Test Cycle (@Test)
   - Launch browser
   - Navigate to position
   - Start proxy
   - Perform main action
   - Stop proxy
   - Clean up test
   (A skeleton of this cycle appears below.)
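
A TestNG skeleton of that cycle might look like the following. The URL, locator, and beacon path are placeholders, and wiring the proxy in at browser launch (then starting/stopping HAR recording around the main action) is one reasonable reading of "start proxy"/"stop proxy", not the deck's exact code:

```java
import net.lightbody.bmp.BrowserMobProxyServer;
import net.lightbody.bmp.client.ClientUtil;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
import static org.testng.Assert.assertTrue;

public class AnalyticsTestCycle {
    private BrowserMobProxyServer proxy;
    private WebDriver driver;

    @BeforeMethod
    public void launchAndNavigate() {
        proxy = new BrowserMobProxyServer();
        proxy.start(0);                                   // the proxy must be wired in at launch
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability(CapabilityType.PROXY, ClientUtil.createSeleniumProxy(proxy));
        driver = new FirefoxDriver(caps);                 // launch browser
        driver.get("https://www.example.com/store/acme"); // navigate to position (placeholder URL)
    }

    @Test
    public void mainActionFiresExpectedBeacon() {
        proxy.newHar("main-action");                      // "start proxy": begin recording
        driver.findElement(By.id("coupon-1")).click();    // perform main action (placeholder locator)
        boolean beaconSeen = proxy.getHar().getLog().getEntries().stream()
                .anyMatch(e -> e.getRequest().getUrl().contains("s.gif")); // placeholder beacon path
        proxy.endHar();                                   // "stop proxy": stop recording
        assertTrue(beaconSeen, "expected an analytics beacon after the main action");
    }

    @AfterMethod
    public void cleanUp() {
        if (driver != null) driver.quit();                // clean up test
        if (proxy != null) proxy.stop();
    }
}
```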
20. Execution
   - CI
   - Maven
   - TestNG

21. Our Tech Stack
22. Reporting
   - Report to a dashboard
   - Indicates "PASS", "FAIL", and "Staleness" (a result-model sketch follows)
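
The deck does not show its reporting code; purely as an assumed illustration, a dashboard result model that flags a test that has not run recently as stale might look like this (the seven-day window is an invented threshold):

```java
import java.time.Duration;
import java.time.Instant;

/** Sketch of a dashboard result model; PASS/FAIL come from the test, STALE from age. */
public class BeaconResult {
    enum Status { PASS, FAIL, STALE }

    private static final Duration STALENESS_WINDOW = Duration.ofDays(7); // assumed threshold

    private final boolean passed;
    private final Instant lastRun;

    public BeaconResult(boolean passed, Instant lastRun) {
        this.passed = passed;
        this.lastRun = lastRun;
    }

    /** A result that has not been refreshed recently is flagged STALE rather than trusted. */
    public Status status() {
        if (Instant.now().isAfter(lastRun.plus(STALENESS_WINDOW))) {
            return Status.STALE;
        }
        return passed ? Status.PASS : Status.FAIL;
    }
}
```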
23. Conclusion
   - User analytics are your CEO's favorite subject!
   - Deliver real value: million-dollar decisions are made with this data
   - Analytics can be implemented with just as many bugs as any other kind of software

24. Questions?
   - @mmerrell
   - mmerrell@rmn.com
