
Continuous Improvement (GroupOn, Palo Alto 2013)

39,430 views


Slides from my talk at the GroupOn office in Palo Alto, CA. http://bit.ly/X4NDxt

Published in: Technology


  1. Continuous Improvement: How Continuous Delivery is changing Quality Assurance. GroupOn, Palo Alto, January 15, 2013. Noah Sussman, ns@noahsussman.com, @noahsussman
  2. The canonical Agile release cycle film still from The Lord of the Rings
  3. Sprints of two or more weeks in length Cocento Tecnologia on Flickr
  4. Start deployment once the sprint is over TheMasonDixon on Etsy
  5. QA is part of the release process SisterDimension on Flickr
  6. QA sign-off is required before going live film still from The Lord of the Rings
  7. The Continuous release cycle evoo73 on Flickr
  8. Minimum viable feature set Travis S. on Flickr
  9. Releasing a feature is decoupled from deploying code. David E. Smith on Flickr
  10. An airport without an air traffic controller. —Chad Dickerson
  11. Real-time data on how releases impact revenue Etsy
  12. Default to open access
  13. Constant tweaks to live features
  14. Large features are deployed piecemeal over time dogpose on Flickr
  15. Every feature is part of an A/B campaign NASA
  16. Dark launches Joy and Jon
  17. Opt-in experiments NASA
  18. Partial rollouts NASA
  19. Config Flags Wikipedia
  20. Wire-Offs Joe Thomissen on Flickr
  21. if ($cfg["new_search"]) {
          // new hotness
          $resp = search_solr();
      } else {
          // old busted
          $resp = search_grep();
      }
  22. $cfg = array(
          "checkout"   => true,
          "homepage"   => true,
          "profiles"   => true,
          "new_search" => false,
      );
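The two slides above show a config flag gating a code path. The same mechanism extends to the partial rollouts and A/B campaigns from slides 15 and 18: instead of a plain boolean, bucket each user deterministically and enable the flag for only a percentage of them. A minimal sketch, assuming a hypothetical `feature_enabled_for()` helper (not from the talk):

```php
<?php
// Hypothetical helper: deterministically enable a feature for a
// percentage of users by hashing the user ID into 100 buckets.
// The same user always lands in the same bucket, so the feature
// stays consistently on (or off) for them across requests.
function feature_enabled_for(string $userId, int $rolloutPercent): bool {
    $bucket = abs(crc32("new_search:" . $userId)) % 100;  // 0..99
    return $bucket < $rolloutPercent;
}

// 0% is off for everyone; 100% is on for everyone.
var_dump(feature_enabled_for("alice", 0));    // bool(false)
var_dump(feature_enabled_for("alice", 100));  // bool(true)
```

Ramping the percentage up over time gives the "large features deployed piecemeal" behavior of slide 14, with a config change rather than a deploy.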
  23. There is no “done done” NASA
  24. Observed Behavior Of Complex Systems NASA
  25. Emergent behaviors require unplanned responses NASA
  26. Improvements are discovered rather than designed NASA
  27. Users of the system have complex expectations NASA
  28. Complex systems are never “complete” NASA
  29. QA Happens When? NASA
  30. First of all, what is “Quality Assurance?” NASA
  31. QA: assuring that there are no defects? NASA
  32. It is impossible to prove the absence of defects NASA
  33. There will always be bugs in production Lukjonis
  34. Testing is everyone’s job. Library of Congress
  35. Myths About Bug Detection The Jargon File
  36. Myth: there are a finite number of bugs
  37. Myth: there are a finite number of detectable bugs niscratz on Flickr
  38. Myth: all severity one bugs can be found before release NASA
  39. Myth: software is built to specifications Fred Brooks at Etsy
  40. Myth: at some point, software is finished
  41. Myth: most bugs have complex, unpredictable causes
  42. The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking... disguises that the error is the programmer's own creation. — Edsger Dijkstra
  43. The whole time I'm programming, I'm constantly checking my assumptions. — Rasmus Lerdorf @loriabys
  44. As you're about to add a comment, ask yourself, "How can I improve the code so that this comment isn't needed?" Improve the code and then document it to make it even clearer. — Steve McConnell
  45. Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. — Brian Kernighan
  46. No blame
  47. Many Small Anomalies Combined
  48. An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses. — Wikipedia
  49. John Allspaw
  50. Prioritize the elimination of small errors John Allspaw
  51. Focus less on mitigation of large, catastrophic failures
  52. Optimize for recovery rather than failure prevention. Failure is inevitable.Richard Avedon
  53. Unit testing is great for preventing small errors John Allspaw
  54. Resilience, Not “Quality” John Allspaw
  55. Readable code NASA
  56. Reasonable test coverage NASA
  57. Sane architecture NASA
  58. Good debugging tools NASA
  59. An engineering culture that values refactoring NASA
  60. Measurable goals NASA
  61. Manual Testing.But probably not the kind you’re thinking of. NASA
  62. Real-Time Monitoring is the new face of testing NASA
  63. Etsy
  64. Etsy
  65. Etsy
  66. Anomaly detection is hard. Greg and Tim Hildebrandt
  67. Watching the graphs NASA
  68. As of 2012, Etsy collected well over a quarter million real-time metrics NASA
  69. Deciding which metrics matter is a human problem NASA
  70. Everyone watches some subset of the graphs NASA
  71. Human vision is an excellent tool for anomaly detection NASA
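The quarter-million metrics of slide 68 were collected with tools like StatsD, which Etsy open-sourced around the era of this talk. The core idea is small enough to sketch: each metric is a plain-text line of the form `name:value|type`, fired at a daemon over UDP so instrumentation is fire-and-forget and can never block or crash the page. The helper names below are illustrative, not from the talk, and assume a StatsD daemon on localhost:8125:

```php
<?php
// Format a StatsD counter line: "metric:value|c".
function statsd_counter(string $metric, int $value): string {
    return sprintf("%s:%d|c", $metric, $value);
}

// Fire-and-forget over UDP: if nothing is listening, the packet is
// silently dropped, so instrumentation cannot take the site down.
function statsd_send(string $line, string $host = "127.0.0.1", int $port = 8125): void {
    $fp = @fsockopen("udp://" . $host, $port, $errno, $errstr, 1);
    if ($fp !== false) {
        fwrite($fp, $line);
        fclose($fp);
    }
}

statsd_send(statsd_counter("deploys", 1));         // count a deploy
statsd_send(statsd_counter("checkout.errors", 1)); // count an error
```

Because emitting a metric is this cheap, engineers can instrument nearly everything and leave the hard problem, deciding which graphs matter, to humans, as the slides above argue.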
  72. QA happens when??? NASA
  73. Exploratory testing can be performed at any time NASA
  74. Rigorous, scientific approach NASA
  75. Focus on customer satisfaction NASA
  76. Less focus on product specifications NASA
  77. Exploratory Testing is equally useful before or after a release NASA
  78. Just Quality NASA
  79. “Assurance” is a terrible word.Let’s discard it. NASA
  80. Quality exists, but it’s tricky to assure or prove that NASA
  81. There’s no such thing as a formal proof of quality NASA
  82. Most of us would agree that quality exists NASA
  83. “Customer Experience” is a better term of art than “Quality” NASA
  84. Customer Experience.Though there’s no formal proof for that, either. NASA
  85. Exploratory Testing addresses areas that Developer Testing doesn’t NASA
  86. Developer Testing validates assumptions NASA
  87. The Independent Tester’s job is to invalidate assumptions NASA
  88. Technology Informs Customer Experience NASA
  89. Exploratory Testing requires an understanding of the whole system NASA
  90. Exploratory Testing requires understanding how the system serves a community of users NASA
  91. Customer Experience is as much about technology as it is about product requirements NASA
  92. Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level. — Eric S. Raymond
  93. Source, diffs, logs.If your QA Analysts don’t look at these — teach them. NASA
  94. Customer Support NASA
  95. Your customer support operators spend more time talking to your users than anyone else NASA
  96. Customer Support interfaces with users as individuals rather than as aggregate data NASA
  97. Keep the feedback loop short.
  98. Manage Your Culture.
  99. Efficiency To Thoroughness Trade-Off NASA
  100. Rapid release cycles have different risks than slower release cycles NASA
  101. Continuous Delivery does not alter the fundamental nature of risk NASA
  102. Test in both dev and prod NASA
  103. Detectable errors should be caught in dev NASA
  104. Undetectable errors must be worked out in production NASA
  105. Software exists in context NASA
  106. Networks, services and people are always in flux NASA
  107. Small changesets are easier to debug
  108. An SCM revert is a changeset NASA
  109. Large changesets are riskier and harder to debug NASA
  110. Fail Forward! NASA
  111. Always deploy the HEAD revision of trunk scrapnow on Etsy
  112. Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
  113. Instead of rolling back, fix the problem and move on NASA
  114. Let go of the idea of “last stable release” NASA
  115. Focus less on satisfying the requirements Scott Holloway
  116. Watch the graphs NASA
  117. Listen to your customers NASA
  118. Build a culture of shared responsibility Kirsten Dunst on the set of Marie Antoinette
  119. Low-Ceremony Process Kirsten Dunst on the set of Marie Antoinette
  120. Iteratively improve your product WSHS Science blog
  121. Further Reading
       "How Google Tests Software," James Whittaker
       "Look At Your Data," John Rauser
       "Optimizing For Developer Happiness," Chad Dickerson
       "Outages, Postmortems and Human Error," John Allspaw
       http://en.wikipedia.org/wiki/Swiss_cheese_model
       "What Is Exploratory Testing?," James Bach
  122. Questions? @noahsussman ns@noahsussman.com infiniteundo.com