2. Agenda / Your Mission
• Introduction
• Maskirovka / Information Operations
• Deep Fakes
• Disinformation
• False Narratives
• Getting familiar with examples
• Emotional Aspects
• Combating IOs
• Semester Project Option
3. Your guide
Heather Vescent
CEO/Researcher,
The Purple Tornado
Strategic Intelligence Consultancy
• Author, 3 DHS Funded ID Reports
• Author, Cyber Attack Manual
• Author, SSI Report
• Filmmaker, 14 Films
4. My path to Maskirovka obsession
• 90s: Adbusters anti-advertising
• 90s-00s: Culture Jamming
• 90s-00s: Cacophony Society + Dada
• 90s-00s: Disinformation as art
• 10s-20s: Futurist/Researcher
• 2017: Gamergate/Trolling – Cyber Attack Manual
• 2018: DHS research award: Securing Voter Data
• 2018: Information Operations
• 2019: Maskirovka – Defcon
• 2018-9: Deep Fakes – Voice, Text & Video
• 2019: False Narratives – Impeachment
At stake: our ability to make decisions free of manipulation
5. Culture Jamming
• Anti-consumerist (from Adbusters)
• Challenge existing socio-cultural beliefs / the consumer
culture that was the focus in the 80s & 90s
• Examples
Banksy
Billboard Liberation Front (BLF)
6. Advertising
• Create the desire for a product, i.e., manipulating an
audience into needing to purchase what is advertised.
• Primary business model for the internet
• Targeted customers = proven value to advertisers = charge
more $$
• Collect, share, sell data for more targeted advertising = $$$
• Develop technology to make targeting even better: cookies,
SEO, beacons, data profiles, buying/selling data,
knowing your identity
• Can the same tools/data be used as a weapon?
• By a nation state
• Voter data + data from brokers = detailed voter profiles, aka
Cambridge Analytica (a minimal sketch of the join follows below)
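Below is a minimal sketch of the kind of data join that bullet describes: merging a hypothetical voter file with hypothetical data-broker records to build per-voter profiles. The file names, fields, and name-based matching are illustrative assumptions; real matching is fuzzier (name plus address) and the real datasets are far richer.

```python
# Minimal sketch (hypothetical files and fields): merge a voter file with
# purchased broker data to build per-voter profiles for micro-targeting.
import csv

def load_rows(path, key):
    """Load a CSV into a dict keyed by the given column."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

# voter_file.csv:  name, address, party, vote_history
# broker_data.csv: name, address, interests, purchases, social_handles
voters = load_rows("voter_file.csv", "name")
broker = load_rows("broker_data.csv", "name")

# For each registered voter, layer the broker's consumer data on top of the
# registration data; the combined record is what enables micro-targeting.
profiles = {name: {**voters[name], **broker.get(name, {})} for name in voters}

for name, profile in list(profiles.items())[:3]:
    print(name, profile)
```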
7. Propaganda
Biased information to promote a particular political point of
view.
• From a Nation State (U.S., China, N. Korea)
• From a political perspective (RT.com – Russian
Federation)
8. Social Engineering
Infosec term: psychological manipulation
• Business Email Compromise / CEO Fraud
• Pathé – 20M Euro transfer to hackers
• Friend’s $40M transfer to Romanian Bank
• Deep Fake with synthetic voice: https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402
• Gaining entrance into secure areas (PenTesters)
• Cell phone company example
• Changing access to your phone account (to gain access to your 2-factor codes)
• https://www.youtube.com/watch?v=lc7scxvKQOo
• Manipulation and deception
9. When Nation States use these methods
• Different motivations
• Different goals
• Different resources ($$, people, other items)
• Play the long game
• Private sector is notoriously constrained by short-term
thinking
• Not about winning the $$ or economics game, but about:
• Belief
• Action
• Power
10. Question
Who interfered with the 2016 election?
A. Ukraine
B. Yemen
C. Russia
D. Sudan
(hold your answers until I tell you to post them)
11. Unintended Consequences
• Technology is used in novel ways
• Technologists tend to develop in a vacuum, focused on the functional
• But technology influences the world, and the world
influences technology
Unintended consequences are unforeseen, and primarily
undesirable, outcomes of an action or decision.
12. Foresight Method: STEEPV
S – Social: Demographics, Lifestyle, Education, Health
T – Technology: Rate of tech progress, Moore’s Law
E – Economic: Growth, Stagnation, K-Waves, Markets
E – Environmental: Climate Change, Sustainability, Biodiversity
P – Political (incl. Military): Dominant views, military actors
V – Values: Traditional, Modern, Post-Modern, Integral
Adapted and expanded from Dr. Peter C. Bishop and Andy Hines, 2013
14. Maskirovka
Maskirovka is the Russian term for the various
methods of Russian information operations.
• To mask, camouflage, deceive
• Camouflage the truth
• Rose colored glasses, RT colored glasses
Information operations are efforts to use
information to influence a target audience.
15. Limited History of Maskirovka
• Military perspective
• Documented in use since the 1990s
• Best suited for unevenly matched adversaries
• Russia uses it in times of war AND peace
• Internet is a force multiplier, easier & cheaper entry
• Media also supports Maskirovka objectives, because the
stories are sensational, evoke emotion, and drive clicks.
• Our social media algos amplify what is popular – no
matter whether it’s true or not.
See Daniel Bagge’s book, Unmasking Maskirovka.
16. Types of Information Operations
• Disinformation
• Bot Networks
• Deepfakes
• False/Fictional Narratives
• Active Measures
• Reflexive Control
The goal is to shape your view of reality to one where
you make decisions that benefit your adversary.
17. Interrupt your decision making process
• Make (incorrect) decisions based on your (misled)
understanding of reality
• Shift Overton Window
• Overload you with information
• What is real
• What can I trust
• Analysis Paralysis
• Disengage & retreat
• Like a buffer overflow attack
but for decision making/rationality.
18. Disinformation
• Spreading false information specifically to mislead
• Coronavirus (Covid-19) example
• Just like the Flu or cause for concern?
• Hand sanitizer or soap?
• Is autism caused by vaccines?
• CDC says no:
https://www.cdc.gov/vaccinesafety/concerns/autism.html
20. Bot Networks
• Bot networks amplify and promote certain messaging
(not unlike paid advertising!)
• Deception: they aren’t real people
• Potentially under opposition/adversary control
• Act like participants but share same talking points
• Copy/paste
• You can understand the false narrative by watching what is being
promoted (see the sketch after this list)
• Thousands of them on Twitter
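A minimal sketch of that "watch what is being promoted" point, under the assumption that you already have (account, post text) pairs exported from a platform: normalize each post and flag talking points pushed verbatim by many distinct accounts. The sample posts and the threshold are illustrative only.

```python
# Minimal sketch (illustrative data and threshold): detect copy/paste
# amplification by grouping posts whose normalized text matches across
# many distinct accounts.
import re
from collections import defaultdict

def normalize(text):
    # Lowercase and strip URLs, @mentions, and punctuation so trivially
    # edited copies (added link, extra !!) still fall in the same bucket.
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    text = re.sub(r"[^a-z0-9 ]+", " ", text)
    return " ".join(text.split())

def copy_paste_clusters(posts, min_accounts=5):
    """posts: iterable of (account, text) pairs.
    Returns talking points pushed by at least min_accounts accounts."""
    clusters = defaultdict(set)
    for account, text in posts:
        clusters[normalize(text)].add(account)
    return {talking_point: accounts
            for talking_point, accounts in clusters.items()
            if len(accounts) >= min_accounts}

sample = [("acct_a", "They are hiding the truth! #WakeUp"),
          ("acct_b", "They are HIDING the truth!! https://t.co/xyz"),
          ("acct_c", "they are hiding the truth")]
print(copy_paste_clusters(sample, min_accounts=2))
```

Clusters that surface this way show you which talking points are being amplified; they do not by themselves prove who is behind the accounts.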
21. Deepfakes
• Deep learning + fake = Deepfake
• Synthetic and manipulated media
• The Bill Hader / Tom Cruise deepfake:
https://youtu.be/VWrhRBb-1Ig?t=109
• Sherin Mathews & Amanda House @ RSA:
https://www.youtube.com/watch?v=DGdY-UWOfoo
22. False Narratives
A false narrative is the goal/result of an information
operation. It is a story that, when believed, causes its target
(the believer) to take actions that benefit the adversary (the
initiator of the false narrative).
24. Information Operations Kill Chain
• Kill Chain: the steps you go
through to target & achieve
your objective
• 8 Steps for InfoOps
• Bruce Schneier
• Security Professional
• www.schneier.com
25. Information Operations Kill Chain #1
1. Find the cracks in the fabric of society — the social,
demographic, economic and ethnic divisions.
• Edges of political beliefs (Trump/Sanders)
• Black Lives Matter
• Climate Change
• Can we trust the election?
26. Information Operations Kill Chain #2
2. Seed distortion by creating alternative narratives.
• In the 1980s, this was a single “big lie,” but today it is
more about many contradictory alternative truths — a
“firehose of falsehood” — that distorts the political
debate.
27. Information Operations Kill Chain #3
3. Wrap those narratives around kernels of truth. A core of
fact helps the falsities spread.
• Taking information out of context
• Election security: Illinois database hack example
28. Information Operations Kill Chain #4
4. Build audiences, either by directly controlling a platform
(like RT) or by cultivating relationships with people who will
be receptive to those narratives.
• You can also game a social platform through advertising
and using bots to promote specific content – not unlike
SEO.
• Anyone can start a podcast, YT/IG channel
• RT.com
• Academic publishing can be gamed as well
29. Information Operations Kill Chain #5
5. Conceal your hand; make it seem as if the stories came
from somewhere else.
• Wikileaks
• Election security: DNC hack narratives / Who did it?
• See step 6 (next)
30. Information Operations Kill Chain #6
6. Cultivate “useful idiots” who believe and amplify the
narratives. Encourage them to take positions even more
extreme than they would otherwise.
• Amplify people who share the disinformation
• Use bots to amplify and deceive about the size of
interest
31. Information Operations Kill Chain #7
7. Deny involvement, even if the truth is obvious.
• Election interference narrative: Russia or Ukraine?
32. Information Operations Kill Chain #8
8. Play the long game. Strive for long-term impact over
immediate impact.
• For election interference, it’s been happening since
2016, and we’re living through it now.
33. Activity: Identify your own
Take 5 minutes – go into your favorite social media app –
can you find/identify/agree upon an example of an
information operation:
• Advertising
• Propaganda
• Disinformation
• Deepfake
• False Narrative
40. Strategies
• As a society
• Through the technology platform
• As Individuals
41. Revisiting Bruce Schneier
Step 1: Find the cracks.
There will always be open disagreements in a democratic society, but
one defense is to shore up the institutions that make that society
possible. Elsewhere I have written about the “common political
knowledge” necessary for democracies to function.
We need to strengthen that shared knowledge, thereby making it
harder to exploit the inevitable cracks.
We need to make it unacceptable—or at least costly—for domestic
actors to use these same disinformation techniques in their own
rhetoric and political maneuvering, and to highlight and encourage
cooperation when politicians honestly work across party lines.
We need to become reflexively suspicious of information that makes
us angry at our fellow citizens. We cannot entirely fix the cracks, as
they emerge from the diversity that makes democracies strong; but we
can make them harder to exploit.
42. Revisiting Bruce Schneier
Step 2: Seed distortion.
We need to teach better digital literacy.
• S.I.F.T.
This alone cannot solve the problem, as much sharing of
fake news is about social signaling, and those who share it
care more about how it demonstrates their core beliefs than
whether or not it is true. Still, it is part of the solution.
43. Revisiting Bruce Schneier
Step 3: Wrap the narratives in kernels of truth.
Defenses involve exposing the untruths and distortions,
but this is also complicated to put into practice.
Psychologists have demonstrated that an inadvertent effect
of debunking a piece of fake news is to amplify the
message of that debunked story.
• Backfire effect
Hence, it is essential to replace the fake news with
accurate narratives that counter the propaganda. That
kernel of truth is part of a larger true narrative. We need to
ensure that the true narrative is legitimized and promoted.
44. Revisiting Bruce Schneier
Step 4: Build audiences.
This is where social media companies have made all the
difference. By allowing groups of like-minded people to find
and talk to each other, these companies have given
propagandists the ability to find audiences who are
receptive to their messages.
• (Pop your) Filter bubbles
Here, the defenses center around making disinformation
efforts less effective. Social media companies
need to detect and delete accounts belonging to
propagandists and bots and groups run by those
propagandists.
45. Revisiting Bruce Schneier
Step 5: Conceal your hand.
Here the answer is attribution, attribution, attribution. The
quicker we can publicly attribute information operations, the
more effectively we can defend against them.
This will require efforts by both the social media platforms and
the intelligence community, not just to detect information
operations and expose them but also to be able
to attribute attacks.
Social media companies need to be more transparent about
how their algorithms work and make source publications more
obvious for online articles. Even small measures like the Honest
Ads Act, requiring transparency in online political ads, will help.
Where companies lack business incentives to do this, regulation
will be the only answer.
46. Revisiting Bruce Schneier
Step 6: Cultivate useful idiots.
We can mitigate the influence of people who disseminate
harmful information, even if they are unaware they are
amplifying deliberate propaganda. This does not mean that the
government needs to regulate speech; corporate platforms
already employ a variety of systems to amplify and diminish
particular speakers and messages.
Additionally, the antidote to the ignorant people who repeat and
amplify propaganda messages is other influencers who respond
with the truth—in the words of one report, we must “make the
truth louder.” Of course, there will always be true believers whom
no amount of fact-checking or counter-speech will convince; this is
not intended for them. Focus instead on persuading the persuadable.
47. Revisiting Bruce Schneier
Step 7: Deny everything.
When attack attribution relies on secret evidence, it is easy
for the attacker to deny involvement.
Public attribution of information attacks must be
accompanied by convincing evidence. This will be
difficult when attribution involves classified intelligence
information, but there is no alternative.
Trusting the government without evidence, as the NSA’s
Rob Joyce recommended in a 2016 talk, is not enough.
Governments will have to disclose.
48. Revisiting Bruce Schneier
Step 8: Play the long game.
Counterattacks can disrupt the attacker’s ability to maintain
information operations, as U.S. Cyber Command did during
the 2018 midterm elections. The NSA’s new policy of
“persistent engagement” (see the article by, and interview
with, U.S. Cyber Command Commander Gen. Paul
Nakasone) is a strategy to achieve this. Defenders
can play the long game, too. We need to better encourage
people to think for the long term: beyond the next election
cycle or quarterly earnings report.
49. Technical Fixes
• Detecting Deep Fakes – videos and AI created
text/headlines (McAfee researchers)
• https://www.youtube.com/watch?v=DGdY-UWOfoo
• Increasing trust of the source media
• WhatsApp India: fact-check on shared materials; if it
doesn’t pass, the platform doesn’t share it
• Verifying whether video media has been tampered with (see the sketch below)
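One hedged sketch of the tamper check in that last bullet, assuming the original publisher distributes a known-good SHA-256 digest for the video (real provenance efforts lean on signed metadata, but the principle is the same): any re-encode, frame splice, or audio swap changes the digest.

```python
# Minimal sketch (hypothetical file name and digest): compare a downloaded
# video's SHA-256 digest against the one published by the original source.
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Hash the file in chunks so large videos don't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_untampered(path, published_digest):
    # A mismatch means this copy differs from what the source released;
    # it says nothing about whether the original content was truthful.
    return sha256_of_file(path) == published_digest.lower()

# Usage (hypothetical values):
# print(is_untampered("statement.mp4", "3a7bd3e2360a3d..."))
```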
52. What can individuals do?
• Fact Checking
• Use S.I.F.T.
• Killing the negative
• Emotional resilience
• Are you being emotionally triggered?
• Pause before you act
• Understand your own biases & the biases of the sites
you visit
53. Practice S.I.F.T.
The acronym S.I.F.T.:
1. Stop.
2. Investigate the source.
3. Find better coverage.
4. Trace claims, quotes, and media
to the original context.
- Mike Caulfield https://twitter.com/holden
54. Evolving Space
• There are gaps; lots still to discover
• So far, the space has been approached mostly from a security perspective
• Many are looking for technical solutions
• Opportunities for insight from other skillsets & areas of
expertise
55. Explore the Edge: Semester Project
Information Operations (IOs) are psychological operations used to
influence emotions, beliefs and decision making that will benefit an
adversary’s goals. IOs include:
• Disinformation: the spreading of false information or propaganda
• Deep fakes: doctored media where audio and/or video has been
replaced
• Maskirovka: the general term for Russian deception techniques
used in and beyond times of war.
This project focuses on one aspect of information operations: identifying
operations in the wild, verifying truths, identifying deception, and/or
combating information operations from a social or technological
perspective. Information operations occur on social media platforms
and originate from a variety of countries targeting internal (domestic)
and external (international) audiences.
56. Thank you + Questions
Heather Vescent
The Purple Tornado
@heathervescent
heathervescent@gmail.com
57. References
• Culture Jamming: https://en.wikipedia.org/wiki/Culture_jamming
• Billboard Liberation Front: http://www.billboardliberation.com/
• Deep Fake Voice Hack: https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402
• Social Manipulation to gain access to your cell phone account: https://www.youtube.com/watch?v=lc7scxvKQOo
• CEO Fraud: https://medium.com/in-present-tense/solving-ceo-fraud-with-3-new-identity-solutions-1a19daf6832e
• Infodemic: https://infodemic.blog/
• Infodemic: https://twitter.com/infodemicblog
• Coronavirus misinformation: https://onezero.medium.com/the-simplest-way-to-spot-coronavirus-misinformation-on-social-media-4b7995448071
• CDC Vaccines: https://www.cdc.gov/vaccinesafety/concerns/autism.html
• BBC Goop: https://www.bbc.com/news/health-51312441
• Identifying bots & bot networks: https://twitter.com/josh_emerson
• Identifying bots & bot networks: https://medium.com/@josh_emerson
• Identifying bots & bot networks: https://medium.com/@josh_emerson/disinformation-accounts-or-spam-bb0d04b2b171
• Sherin Mathews and Amanda House: https://www.youtube.com/watch?v=DGdY-UWOfoo
• Bill Hader/Tom Cruise deep fake: https://www.theguardian.com/news/shortcuts/2019/aug/13/danger-deepfakes-viral-video-bill-hader-tom-cruise
• Senate Report: https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume1.pdf
• Fiona Hill https://youtu.be/L5gmpdtbWB0?t=298
• Bruce Schneier: https://www.lawfareblog.com/toward-information-operations-kill-chain
• The Oatmeal/Backfire Effect: https://theoatmeal.com/comics/believe_clean
• Confirmation Bias: https://en.wikipedia.org/wiki/Confirmation_bias
• Your inner troll: https://news.stanford.edu/2017/02/06/stanford-research-shows-anyone-can-become-internet-troll/
• Anyone can be a troll: https://files.clr3.com/papers/2017_anyone.pdf
• Twitter’s synthetic and manipulated media: https://help.twitter.com/en/rules-and-policies/manipulated-media
• Verge article: https://www.theverge.com/2020/3/8/21170714/twitter-manipulated-media-biden-video-retweeted-trump
• Bad is stronger than good: https://www.researchgate.net/publication/46608952_Bad_Is_Stronger_than_Good
• Source: https://onezero.medium.com/the-simplest-way-to-spot-coronavirus-misinformation-on-social-media-4b7995448071
• SIFT: https://infodemic.blog/