12. References
Treverton, Gregory F. "Risks and Riddles." Smithsonian magazine, June 2007. http://www.smithsonianmag.com/people-places/presence_puzzle.html
Gladwell, Malcolm. "Open Secrets." The New Yorker, January 2007. http://www.newyorker.com/reporting/2007/01/08/070108fa_fact

About Me
Lucy Spence, Head of Product (Acquisition & Retention), LOVEFiLM
Twitter: @lucyjspence
Editor's notes
Post Sept 11: Puzzles & Mysteries. Gregory Treverton, former Vice Chair of the US National Intelligence Council, made the distinction. I encountered the concept via Malcolm Gladwell's "What the Dog Saw", in the chapter "Open Secrets".
Puzzles are binary: the Cold War – specific information about the Soviet Union's activities; Osama Bin Laden's whereabouts. Individuals discover key pieces of information, and persistence will get there in the end.
Mysteries don't have clear-cut answers, and they evolve: What happens when you invade a country or support regime change? How do you detect terrorist cells and their activities? Vast amounts of information are available (SMS, email, credit cards, travel arrangements, accommodation, etc.), but deciphering it and detecting what's important is difficult. Often the level of noise in the data obscures what's interesting.
Are puzzles. It's where a lot of qualitative research techniques, and the user experience industry itself, evolved from. Research tends to produce reliable results that we can use to solve problems, and the extent to which the problem is fixed is reasonably predictable.
Are mysteries: often when you think you've solved one problem, it just resurfaces elsewhere. We now have vast amounts of data and tools to interpret results, but it's hard to find actionable insight.
Groundhog Day – struggling to find clear answers in data. The opposite of the virtuous circle is the wicked whirlpool of tools:
- The business wants more definitive answers and more predictable outcomes
- If the answers aren't apparent, use more sophisticated tools and get more data
- If no more insight is achieved, repeat the previous step
More time is spent implementing and learning the tools and analysing, and less time solving problems by actually designing, building, experimenting and improving.
It's often difficult to find answers because:
- Tools usually don't measure what we actually want to know, so we have to make complex inferences
- It's easy to start chasing answers, continually looking at more segments to find some explanation
- You can easily lose focus on what you're actually trying to do
Pick out the small pieces that are clear areas to focus on (we usually do this already and call them 'quick wins'). Their characteristics are usually:
- Clear / obvious metrics for success
- Easily predictable solutions
- Little contention about approach
So get these out of the way as quickly as you can.
Recognise when it's part of a bigger journey and you could just be moving the weakest point to another part of the system. Change approach, or at least recognise in advance that this will happen.
Think about designing for experiments instead of designing just one solution, and then test radically different solutions. For a bit more effort upfront you can often get a framework which allows for multiple solutions, and if you're going to be iterating then it'll save you a lot in the long run. Work with your developers on how you can maximise your flexibility.
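A minimal sketch of what "a framework which allows for multiple solutions" could look like in practice — the experiment name, variant names and hashing scheme here are all illustrative assumptions, not anything from the talk. The idea is that page code asks for a variant by name, so adding a radically different treatment means registering it rather than rewriting the page:

```python
import hashlib

# Hypothetical variant registry: each experiment maps to the list of
# treatments currently under test. Adding a new design is one line here.
VARIANTS = {
    "signup_cta": ["control", "big_button", "two_step_form"],
}

def assign(experiment: str, user_id: str) -> str:
    """Deterministically bucket a user into one of the registered variants.

    Hashing (experiment, user) means the same user always sees the same
    variant, and different experiments bucket independently.
    """
    arms = VARIANTS[experiment]
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

variant = assign("signup_cta", "user-42")
```

One caveat with this naive modulo scheme: adding a variant later reshuffles existing users between buckets, so production systems usually reserve fixed hash ranges per variant instead.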
Learn different analysis techniques. Too often we assume that by just looking at things we will be able to see the meaning in them, yet most of us have limited analytical skills, especially numeric ones.
Understand:
- Standard deviation and statistical significance
- The pitfalls of averages
- Correlations
- Control charts and control limits (but you'll need to know about the different types of distribution for this as well)
Apply:
- Kano modelling
- NPS
Keep up with the new data visualisation tools that can really help understanding.
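To make the control-chart idea above concrete, here is a small sketch using only Python's standard library. The sample data and the 3-sigma threshold are illustrative assumptions; as the note says, the limits assume roughly normal data and need rethinking for other distributions:

```python
import statistics

def control_limits(samples, sigmas=3):
    """Return (lower, centre, upper) control limits at +/- `sigmas`
    sample standard deviations around the mean."""
    centre = statistics.mean(samples)
    sd = statistics.stdev(samples)  # sample standard deviation
    return centre - sigmas * sd, centre, centre + sigmas * sd

# Hypothetical daily signup counts
daily_signups = [120, 131, 118, 125, 140, 122, 129, 135, 127, 124]

lcl, centre, ucl = control_limits(daily_signups)
# Points outside the limits are signals worth investigating;
# points inside are ordinary noise, however tempting they look.
signals = [x for x in daily_signups if not lcl <= x <= ucl]
```

The payoff is the distinction the slide is driving at: variation inside the limits is noise to be left alone, and only points outside them justify a hunt for an explanation.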