"Cutting through the FUD of the Anti-Malware industry to expose some home truths about the real world of independent testing, the state of the threat and the solutions that actually work."
BSides London 2016 Presentation #BSidesLDN2016
2. MalwareMyths.com @carlgottlieb
Carl Gottlieb
• Technical Director, Cognition (InfoSec VAR)
• Podcast Host
• 15 years as InfoSec tech consultant
• The Malware Riddle
• Vendors claim to be great
• Testers confirm it
• Most orgs use multiple layers of best-of-breed tech
• But Malware infections are growing in impact
• What’s going on?
08/06/2016 - Copyright Cognition Secure Ltd 2016 - PUBLIC
3. MalwareMyths.com @carlgottlieb
Myth #1 – You Don’t Need AV
Myth – You Don’t Need AV
• AV is pointless – too easy to bypass
• You’re an OPSEC god
• You’ve got a Mac
• AV increases the threat surface
(You’re a smart arse)
Reality – You Do Need AV
• The Real World:
• Next-gen AV is seriously good
• We all make mistakes
• Macs have joined the malware party too
• The pros outweigh the cons. End of.
4. MalwareMyths.com @carlgottlieb
Myth #2 – AV is Dead
Myth – AV is Dead
1. “AV Is Dead”, “Prevention is Dead”
2. You can’t rely on static analysis
3. 0-Days require dynamic analysis
4. You Need Pretence Defence in Depth
5. You Need to Buy More Stuff
Reality – AV is alive, but very sick
1. We all still need AV and we all still buy it
2. Vendors are doing a bad job
3. AV is cheap for a reason
5. MalwareMyths.com @carlgottlieb
Myth #3 – “Next-Gen” AV is Snake Oil
Myth - “Next-Gen” AV is Snake Oil
• New tech isn’t perfect – so stick with what you know
• It can’t possibly work
• It’s just VirusTotal in an engine
• Next gen is BS
• Detection relies on internet access.
• Complexity = effectiveness.
Reality - “Next-Gen” is AV that works
• You can do a lot better
• Next-Gen vendors don’t use VT
• Offline detection is a thing
• The best products are simple
• Behavioural analysis is risky and flawed
• You can have cheap or you can have good
6. MalwareMyths.com @carlgottlieb
Myth #4 - Trust the Experts
Myth – Trust the Experts
• We’re better because…
• We scored 100% in test XYZ
• Never Pay the Ransom
• False +ve worse than false –ve
• APT APT APT
Reality – Trust No One
• Vendors know very little about their
competitors
• Many SEs can’t touch malware
• Everyone is biased
• Gang rivalry is fierce
• 0-Day != APT
• Focus on your threats and risks
7. MalwareMyths.com @carlgottlieb
Myth #5 – Trust the Testers
Myth – Trust the Testers
• Independent Testing is the answer
• Real World Testing
• Random samples
Reality – Trust No One
• Be Sceptical
• Follow the money
• Real World is anything but
• Don’t trust anyone
• Test products yourself for your org
8. MalwareMyths.com @carlgottlieb
Testing - Signs of Dodginess
• Motivation and Finance -> Bias
• Personal Opinions – e.g. “It looks like they try to avoid getting tested in order to continue to attract users simple by unproven marketing claims.” (AV Comparatives)
• Perfect Scores
• Irrelevant Methodologies and Scoring
• Using VirusTotal for Malware Samples
9. MalwareMyths.com @carlgottlieb
2016 Examples of Bad Testing Methodology
• Samples
• Tiny sample sizes
• Excluding ransomware
• Excluding custom packed/made malware
• Non-corporate greyware
• Configuration – Using the wrong version of the AV product
• Usability – # false positives as sole measure
• Perf Testing – Dropping 15,000 .exe files into a directory
• Detection
• Testing Mac malware on Windows
• Known malicious URLs only (what about USB, email, waterholed websites, network shares…?)
• Excluding malicious admin tools, e.g. PsExec
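One reason tiny sample sizes belong on this list: a perfect score over a small sample set says very little about the true miss rate. By the statistical "rule of three", zero misses in n samples only bounds the true miss rate below roughly 3/n at 95% confidence. A minimal Python sketch (sample counts are illustrative, not from any real test):

```python
def rule_of_three_upper_bound(n_samples: int) -> float:
    """95% upper confidence bound on the true miss rate when a product
    detects every one of n_samples test samples (the 'rule of three')."""
    return 3.0 / n_samples

# A "100% detection" headline over a tiny sample set still leaves a
# wide margin for real-world misses (sample counts are hypothetical).
for n in (30, 300, 3000):
    print(f"{n:5d} samples, 0 misses -> true miss rate could be up to "
          f"{rule_of_three_upper_bound(n):.1%}")
```

So a vendor boasting 100% over 30 samples could still be missing one in ten real-world threats without the test noticing.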
10. MalwareMyths.com @carlgottlieb
Myth #6 – VirusTotal Dropped a Bomb on Next-Gen Vendors
Myth – VT Dropped a Bomb
• VT “cut off” the leechers
• This change broke the “fake” Next-Gen vendors
Reality – No One Big was Affected
• VT uses the CLI engine only
• Pay $$$ for samples
• Integrate your engine for results
• Vendors have all VT data
• Changes at VT had no impact on main next-gen vendors
• Nothing has changed. No one big was affected
11. MalwareMyths.com @carlgottlieb
What You Can Do
1. Be sceptical
2. Be optimistic
3. Pause any anti-malware projects/renewals
4. Get a good VAR
5. Get Your Hands Dirty
Editor's notes
Cutting through the FUD of the Anti-Malware industry to expose some home truths about the real world of independent testing, the state of the threat and the solutions that actually work.
Fighting malware sounds like a simple problem.
Is something good or bad? And if it’s bad just block it.
All the anti-malware vendors claim to be great at fixing the malware problem.
The testing organisations confirm it.
Most organisations use some kind of best-of-breed anti-malware technology, in multiple layers, e.g. AV on desktops, AV on email and web gateways, AV on mail servers, and sandbox-based network malware detection (e.g. FireEye, Palo Alto).
But malware infections are impacting us more and more.
What’s Going On?
This presentation cuts through some of the myths in the anti-malware industry that are hindering our attempts to beat malware.
Vendors are struggling. Their products can’t keep up with new malware and they’re starting to miss old malware. So they convince you:
You need more layers of equally flawed technology. It’s like trying to catch sand with colanders: you can keep adding more colanders, but some sand will eventually slip through.
They keep the price of AV low to retain you as a customer, so that you buy other related products from them that work really well, like DLP or NAC ;)
Low pricing also stops new market entrants from competing on price – the newcomers are always more expensive.
And some vendors think real-time detection is so hard that we shouldn’t even try; instead they turn it into a big data problem, analysing data and network behaviour, and ask you to hire a SOC analyst to do it.
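The colander analogy can be made concrete with a little probability. Assuming, optimistically, that layers fail independently, the leak rate of a stack of filters is the product of their individual miss rates. A hypothetical sketch (all figures illustrative):

```python
def leak_rate(per_layer_catch: float, layers: int) -> float:
    """Fraction of samples missed by every one of `layers` independent
    filters that each catch `per_layer_catch` of malware."""
    return (1.0 - per_layer_catch) ** layers

# Three 90%-effective layers (hypothetical figures) still leak 0.1% of
# samples - across millions of daily samples, a steady trickle gets in.
print(f"3 layers at 90% each leak {leak_rate(0.90, 3):.3%} of samples")
```

In practice layers built on similar signature-based techniques share blind spots, so their misses are correlated and the real leak rate is higher than this independent-layer model suggests – which is exactly the point of the colander analogy.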
AV isn’t dead, it’s just not very good
Saying AV is dead is like saying vaccines are useless because people still get flu
Most Anti-malware companies hate the notion of “next-gen” – it just makes them look a bit old and crap
There’s a lot of FUD being thrown around and no acceptance of the major flaws of traditional AV approaches (e.g. need for constant updates)
Malware protection is like car air bags, you can have cheap or you can have good. If you get hit you don’t want cheap. Just look at the cost of a ransomware incident. (Often causes a full day outage of all IT systems)
Most people in this industry are good and will naturally accentuate the positives and skip over the negatives of their products
But they don’t know what they don’t know and fixate on misunderstandings of competitors’ technology
But almost everything a vendor says publicly is controlled messaging, said for a reason, e.g. a certain AV vendor stating that victims should “never” pay a ransom and it’s “always a bad idea” because they could never be seen to support the malware bad guys.
The vendors have formed packs that fight against their rivals, e.g. “Traditional AV” vs “Next-Gen AV” vs “Isolation” vs “Detect and Respond” vs “Sandbox”
Focus on the threats and risks that your organisation faces. Ransomware may be a priority for some, and state-sponsored targeted attacks a priority for others.
Independent Malware testing is deeply flawed in practice.
Testing methodology, scoring and sample selection are not reflective of your organisation.
Funding and motivation will always bias the tests.
There are some great testers but don’t rely on them.
Test products yourself, focusing on your threats, and use a methodology appropriate to your organisation.
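One way to make "a methodology appropriate to your organisation" concrete is to weight per-category detection rates by your own risk priorities rather than taking a flat average. All product names, categories and figures below are hypothetical; a minimal Python sketch:

```python
# Hypothetical per-category detection rates for two imaginary products.
product_a = {"ransomware": 0.99, "trojans": 0.90, "greyware": 0.70}
product_b = {"ransomware": 0.85, "trojans": 0.98, "greyware": 0.95}

# This org cares overwhelmingly about ransomware (weights are illustrative).
org_weights = {"ransomware": 0.80, "trojans": 0.15, "greyware": 0.05}

def weighted_score(rates: dict, weights: dict) -> float:
    """Score a product by its detection rate per threat category,
    weighted by how much each category matters to this organisation."""
    return sum(rates[cat] * w for cat, w in weights.items())

# Product B wins on a flat average, but A wins once the scores are
# weighted by this organisation's actual risk profile.
flat_a = sum(product_a.values()) / len(product_a)
flat_b = sum(product_b.values()) / len(product_b)
print(f"flat average:  A={flat_a:.3f}  B={flat_b:.3f}")
print(f"org-weighted:  A={weighted_score(product_a, org_weights):.3f}  "
      f"B={weighted_score(product_b, org_weights):.3f}")
```

The point is not the arithmetic but the ranking flip: a generic test's winner is not necessarily your winner.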
Check to see who funded the test. Why was it performed? Tests cost money so there’s always a reason behind investing in performing a test.
A tester who focuses on one group of products will always be biased by the feature-set they are testing and by that group’s funding. (e.g. how well would a flying car compare in a traditional car group test? The tests themselves are biased towards the features of the majority.)
Personal opinions in test results indicate bias.
Perfect scores in tests would suggest perfection in “real life” which no product can ever achieve.
VirusTotal provides all its samples to all the AV companies, so they should all detect them with 100% efficacy.
Many recent tests have used questionable methodology
Rarely do tests look at truly real world scenarios, e.g. handling the 10% of corporate machines with out-of-date AV definitions
Some tests include blocking URLs, i.e. a web filter test. But must an AV product do this? And if it doesn’t, is it right that it gets a perfect 0% false-positive score in the test?
What about testing other malware sources such as USB, email and waterholes?
For more information, see a full write up of the VT FUD debacle here: https://www.linkedin.com/pulse/av-bomb-never-carl-gottlieb
Be sceptical. Don’t trust any claims, results or analysis.
Be optimistic. Know that very good tech exists to virtually solve the malware problem. We can do better than cheap AV
Pause any anti-malware projects/renewals and look at the field
Get a good VAR (there are a few great ones out there) and get their advice, guidance and support. A good VAR will bring a VMware malware lab to your office and let you play with new malware and various vendor products.
Test out some anti-malware products yourself
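For hands-on testing without handling live malware, the industry-standard EICAR test string is a safe first step: every mainstream AV engine is expected to flag a file containing it. A minimal sketch (the output path is illustrative):

```python
from pathlib import Path

# The standard 68-byte EICAR anti-virus test string. It is harmless,
# but any working on-access scanner should flag a file containing it.
EICAR = (r"X5O!P%@AP[4\PZX54(P^)7CC)7}$"
         "EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*")

def write_eicar(path: Path) -> None:
    """Drop the EICAR test file, then check the AV console for an
    alert or quarantine event shortly afterwards."""
    path.write_text(EICAR)

# Uncomment on a test machine only (path is illustrative):
# write_eicar(Path("C:/temp/eicar.com"))
```

EICAR only proves the scanner is switched on and watching; real evaluation still needs live samples in an isolated lab, as above.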