I examine the information security industry through the lens of behavioral models. Traditional ways of thinking about defensive and offensive motivations rely on models such as game theory, which tend to assume the people on each side are “rational” actors. However, human thinking is full of quirks caused by cognitive biases, and these lead to “irrational” behaviors.
The question therefore is: what biases do defenders and attackers have when they make decisions, and how can we leverage these insights to improve the efficacy of defense? In particular, I’ll discuss the implications that theories such as Prospect Theory, time inconsistency, the less-is-better effect, the sunk cost fallacy, dual-system theory, and social biases such as fairness and trust have for why the industry dynamics are the way they are.
Given at NCC Group's Security Open Forum on August 17, 2016
5. Common complaints about infosec
“Snake oil served over word salads”
Hype over APT vs. actual attacks
Not learning from mistakes
Not using data to inform strategy
Playing cat-and-mouse
6. “If you can’t handle me at my
worst, you don’t deserve me at
my best”
– Sun Tzu
7. My goal
Start a different type of discussion on how to fix the industry, based on empirical behavior vs. how people “should” behave
Focus on the framework; my conclusions are just a starting point
Stop shaming defenders for common human biases; you probably suck at dieting, bro
(also I’ll show off some bad amazing cyber art)
16. What are the outcomes?
Criminally under-adopted tools: EMET, 2FA, canaries, white-listing
Criminally over-adopted tools: anti-APT, threat intelligence, IPS/IDS, dark-web anything
17. Incentive problems
Defenders can’t easily evaluate their current security posture, risk level, probabilities and impacts of attack
Defenders only feel pain in the massive-breach instance, otherwise “meh”
Attackers mostly can calculate their position; their weakness is they feel losses 3x as much as defenders
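The “feel losses 3x as much” point is loss aversion from Prospect Theory. A minimal sketch of the Kahneman–Tversky value function makes the asymmetry concrete (the canonical parameter estimates are α ≈ 0.88 and a loss-aversion coefficient λ ≈ 2.25; the slide’s “3x” is a rounder figure, and these numbers are not from the talk itself):

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: diminishing sensitivity
    (alpha < 1) to both gains and losses, with losses weighted
    lam times more heavily than equivalent gains."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss "feels" more than twice as bad as a $100 gain feels good:
gain = prospect_value(100)    # ≈ 57.5
loss = prospect_value(-100)   # ≈ -129.5
```

An attacker who weighs a burned toolchain or a takedown at 2–3x its nominal cost will abandon campaigns that a naive expected-value model says are still profitable.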
31. Dual-system theory
System 1: automatic, fast, non-conscious
System 2: controlled, slow, conscious
System 1 is often dominant in decision-making, esp. with time pressure, busyness, positivity
System 2 is more dominant when it’s personal and/or the person is held accountable
35. Improving heuristics: industry-level
Only hype “legit” bugs / attacks (availability): very unlikely
Proportionally reflect frequency of different types of
attacks (familiarity): unlikely, but easier
Publish accurate threat data and share security metrics
(anchoring): more likely, but difficult
Talk more about 1) the “boring” part of defense / unsexy
tech that really works 2) cool internally-developed tools
(social proof): easy enough
37. Leveraging attacker weaknesses
Attackers are risk averse and won’t attack if:
— Too much uncertainty
— Costs too much
— Payoff is too low
Block low-cost attacks first, minimize ability for recon, stop lateral movement and ability to “one-stop-shop” for data
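The three conditions above can be read as one decision rule: the attacker proceeds only when the uncertainty-discounted payoff clears the cost. A toy sketch, where the function name, the penalty form, and the aversion factor are all illustrative assumptions rather than anything from the talk:

```python
def attacker_proceeds(payoff, cost, p_success, risk_aversion=2.0):
    """Toy model of a risk-averse attacker: the expected payoff is
    discounted by a penalty proportional to the chance of failure,
    then compared against the up-front cost of the attack."""
    expected = p_success * payoff
    uncertainty_penalty = risk_aversion * (1 - p_success) * payoff
    return expected - uncertainty_penalty > cost

# Each defensive lever flips the decision independently:
attacker_proceeds(100, 10, 0.9)   # baseline: attack happens
attacker_proceeds(100, 80, 0.9)   # raise cost -> attack abandoned
attacker_proceeds(100, 10, 0.6)   # add uncertainty -> abandoned
attacker_proceeds(12, 10, 0.9)    # shrink payoff -> abandoned
```

The design point matches the slide: the defender doesn’t need to make attacks impossible, only push any one of the three levers past the attacker’s threshold.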
38. How to promote System 2
Hold defenders extra accountable for strategic and product decisions they make
Make it personal: don’t just check boxes, don’t settle for the status quo, don’t be a sheeple
Leverage the “IKEA effect” – people value things more when they’ve put labor into them (e.g. build internal tooling)
39. Inequity aversion
People really don’t like being treated unfairly
e.g. A is given $10 and can share some portion $X with B, who will get $X * 2. B then has the same option back
— Nash Equilibrium says A gives $0 (self-interest)
— Actual people send ~50% to player B, and B generally sends more back to A than received
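The payoffs in the sharing game above are easy to verify directly. A minimal sketch using the slide’s rules (every transfer doubles in transit; the specific amounts in the usage example are illustrative):

```python
def sharing_game(endowment, a_sends, b_sends):
    """Two-round sharing game from the slide: A sends a_sends
    (B receives double), then B sends b_sends back (A receives
    double). Returns the final holdings (a_final, b_final)."""
    b_holds = 2 * a_sends
    assert b_sends <= b_holds, "B can only return what B holds"
    a_final = endowment - a_sends + 2 * b_sends
    b_final = b_holds - b_sends
    return a_final, b_final

# Nash equilibrium (pure self-interest): A sends nothing.
sharing_game(10, 0, 0)   # -> (10, 0)

# Observed behavior: A sends ~50%, B returns more than A sent.
sharing_game(10, 5, 6)   # -> (17, 4): both end up ahead of Nash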
40. Inequity aversion in infosec
Maymean defenders willbe willingto sharedata,
metrics, strategies
Notnecessarilythe“aslongasI’mfaster than
you”mentalitythatis commonlyassumed
Keyis toset expectationsofanongoing“game”;
repeated interactionspromotes fairness
So,foster acloser-knitdefensive communitylike
there exists for vulnresearchers
40