1. Analytics and Witch Doctoring:
A Cure for the Black Box
Mentality
February 1, 2011
O’Reilly Strata Conference
J.C. Herz, Batchtags LLC
jc@tripledex.com
7. High Status Helplessness
• If you understood the technology, you’d be one of those people whose job it is to make technology work.
• You know, underlings.
8. Executive ADD - Ka-ching!
• Re-starts are where consulting shops make their money
10. Shiny Pebble Syndrome
• Infoviz Porn: Visualization with no use case
– Ex: Social Network visualization. Why?
– START with a use case and work forward
• Demo Envy: just because it looks slick doesn’t mean it’s possible, or even advisable, to pipe your data into it.
11. A Ballad of Spectacular Information Display
• Time Magazine, 1976
• Telex text routing: information off the wire goes to terminals, properly foldered
• Z8 terminal display awes executives
• Pneumatic system not eliminated
13. Half-Ass Syndrome
• Halfway into the project, jump off into the next problem.
• Haven’t refined results or hypotheses
• Failure is blamed on the technology, but it’s really loss of interest and a desire for instant gratification
14. Shelfware Syndrome
• The guy who was driving the program left...
• Approach-Avoidance conflict --> pilot-itis
• A US agency has $30M of software that hasn’t been installed…some of it with maintenance contracts.
• Base Model vs. Fully Loaded
– One enterprise bought $12M worth of Autonomy before figuring out that the add-ons they needed would be another $22M.
16. Critical Question: What is the Validation Test?
• Formulating the validation test keeps both the customer and the developer focused - and honest
• Suggest pay for performance, and see if the developer or vendor freaks out.
• Make sure validation is ongoing - in case the ground is shifting
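"Ongoing validation" can be made concrete with a small sketch: re-score predictions against realized outcomes over a rolling window and flag when accuracy slips, i.e. when the ground is shifting. This is an illustration, not anything from the talk; the window size and accuracy threshold are assumptions.

```python
from collections import deque


class OngoingValidator:
    """Track prediction accuracy over a rolling window and flag drift.

    The window size and minimum-accuracy threshold are illustrative
    assumptions; pick values that match your decision tempo.
    """

    def __init__(self, window=100, min_accuracy=0.7):
        self.results = deque(maxlen=window)  # 1 = correct, 0 = wrong
        self.min_accuracy = min_accuracy

    def record(self, predicted, actual):
        """Score one prediction against the realized outcome."""
        self.results.append(1 if predicted == actual else 0)

    def accuracy(self):
        """Rolling accuracy over the window, or None if no data yet."""
        if not self.results:
            return None
        return sum(self.results) / len(self.results)

    def ground_is_shifting(self):
        """True when rolling accuracy has fallen below the threshold."""
        acc = self.accuracy()
        return acc is not None and acc < self.min_accuracy
```

The point is less the code than the discipline: validation is a standing process that keeps running after the pilot, not a one-time acceptance test.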
20. Critical Question: Data Quality
• How complete is it?
– Ex: 600 custom fields, only two have more than 50% coverage
• How accurate is it? How do you know?
• How consistent is it?
– Good test: make three calls to different parts of the company to get an answer to a factual question that doesn’t require calculation.
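The completeness question lends itself to a quick audit. A minimal sketch (the record format and the 50% threshold are assumptions, echoing the 600-fields example above) that reports per-field coverage:

```python
def field_coverage(records):
    """Return the fraction of records with a non-empty value for each field.

    `records` is assumed to be a list of dicts, one per record.
    """
    if not records:
        return {}
    fields = set().union(*(r.keys() for r in records))
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / len(records)
        for f in fields
    }


def sparse_fields(records, threshold=0.5):
    """Fields below the coverage threshold - candidates for cleanup or removal."""
    return sorted(f for f, cov in field_coverage(records).items() if cov < threshold)
```

Running this over an export of your CRM or ERP records turns "how complete is it?" from a shrug into a number per field.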
22. Critical Question: Real-World Context
• Without real-world data, “behavioral” metrics are misleading
• Where is the transactional data that validates insights from non-transactional data?
• How would you prove the magic analytics WRONG?
23. Data: Gut Check
• Are you prepared to spend painful amounts of money cleaning up your data?
• Crack heads if people don’t share data?
• Make business units accountable for their data?
• Play hardball to make sure data is not stored in single-application proprietary formats?
24. Critical Question: Workflow
• What workflow changes will this proposed capability require?
• People hate changing their workflow, even if it’s an improvement
• Never attribute to stupidity what can be attributed to laziness
• What is your plan for changing workflow? How do you enforce it?
29. Critical Question: Consequences
• What actions are you willing to take on the basis of validated analytic insight?
– Change your product?
– Change your marketing budget?
– Change people’s job descriptions?
– Re-allocate R&D budgets?
• What actions are you not willing to take?
31. Critical Question: Tempo
• How fast will a decision be made on the basis of analytic insight?
• Quarterly?
• Daily?
• Within seconds?
• Milliseconds?
• Never?
• Realtime vs. Continuous vs. Batch
34. Before You Rip ‘n’ Replace
• What is the exit cost of this technology?
• Does “turnkey” mean monoculture?
35. Business Payoff vs. Intellectual Appeal
[Chart plotting examples along these two axes: Social Network Analysis, Operations Research, Market Segmentation, Competitive Intelligence, pilots to test new analyst tools with tiny amounts of generic data, 360º Lead Scoring, Validate Marketing Effectiveness]