Discussion on how to deliver vulnerability management at scale.
Why fullstack vulnerability management is important, why security silos are an issue, the pitfalls of delivering 1000's of assessments on a continuous basis, and how edgescan delivers vulnerability intelligence.
2. Eoin Keary
• CEO/Founder – edgescan.com
• OWASP Global Board Member 2009-2015
• OWASP Project Leader & Contributor
• OWASP Person of the year 2015 & 2016
3. Wolves? Doors? What?
• Wolves = Bad Guys / Hackers
• Doors = Your stuff
• How to protect 1000’s of doors from wolves
continuously and what can go wrong…
4. edgescan – Basis for discussion
• edgescan™ is a sophisticated, enterprise-grade vulnerability assessment and management solution
• edgescan™ helps organisations, from small and medium-sized businesses to large enterprises, identify and remediate known vulnerabilities
• edgescan™ is a cloud-based SaaS
5. How we get the statistical model
• 1000's of vulnerability assessments globally
• #Fullstack view of security
• 99% false-positive free
• Industries: Media, Energy, Government, Pharma, Finance, Software, etc.
7. Web Application Layer (Layer 7)
Lots of high or critical risk issues!!
Easily exploitable
Very Damaging
Very Bad
8. Infrastructure Layer (Non-Web)
Lots of vulnerabilities!
Not many high or critical risk.
More problems, but less vulnerable.
9. Challenge
• Application Layer (Layer 7) is still more vulnerable.
• Applications change more.
• Change results in risk.
• Risk (may) result in vulnerability & breach.
[Diagram: Change → Risk → Vulnerability cycle]
11. Continuous Security
“Keeping up” with development
Assisting secure deployment
Catching bugs early – Push Left
Help ensure “change” is secure
12. How do we manage enterprise
cybersecurity?
• 100’s or 1000’s of web applications
• Perimeter servers
• Cloud environments
• Network Devices, Firewalls, IoT etc
21. Securing 1000 Doors
• Fullstack Vulnerability
Intelligence
– Complete Risk-based view
– Hackers don’t care where the
vulnerability is!
22. #Pitfall - Risk is not Linear
• Low Risk = 1 “point”
• Medium Risk = 5 points
• High Risk = 10 points
But 10 Low Risks != 1 High Risk
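The non-linear scoring above can be sketched as follows. The point values come from the slide; the aggregation rule (the worst finding sets the risk band, and lower-severity findings can never push the score past it) is a hypothetical illustration, not edgescan's actual formula:

```python
# Point values from the slide; the aggregation rule below is a
# hypothetical illustration of non-linear risk scoring.
POINTS = {"low": 1, "medium": 5, "high": 10}

def naive_score(findings):
    """Linear sum -- the pitfall: 10 lows appear to 'equal' 1 high."""
    return sum(POINTS[f] for f in findings)

def risk_score(findings):
    """Non-linear view: overall risk is driven by the worst finding,
    with lower-severity findings only nudging the score upward,
    never into the next severity band."""
    if not findings:
        return 0
    worst = max(POINTS[f] for f in findings)
    others = sum(POINTS[f] for f in findings) - worst
    return worst + min(others, worst - 1)

assert naive_score(["low"] * 10) == naive_score(["high"])  # the broken metric
assert risk_score(["low"] * 10) < risk_score(["high"])     # the intended ordering
```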
23. #Pitfall - Templated Sites
• Vulnerabilities may need to be “Singleton”
– Only one instance reported
– High Volume “Low Risk” issues “break” metrics.
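A minimal sketch of the "singleton" idea, assuming findings are keyed by a hypothetical (name, parameter) signature so one instance is reported and the rest are counted rather than inflating the metrics:

```python
# Sketch: collapse duplicate findings across templated sites into a
# single "singleton" finding. The (name, parameter) signature is an
# assumed dedup key; real products may fingerprint differently.
def deduplicate(findings):
    """findings: iterable of dicts with 'name' and 'parameter' keys.
    Keep one instance per signature and count further occurrences."""
    singletons = {}
    for f in findings:
        key = (f["name"], f["parameter"])
        if key in singletons:
            singletons[key]["instances"] += 1
        else:
            singletons[key] = {**f, "instances": 1}
    return list(singletons.values())

# The same low-risk issue on three templated hosts becomes one finding:
raw = [{"name": "Missing HttpOnly flag", "parameter": "session", "host": h}
       for h in ("a.example.com", "b.example.com", "c.example.com")]
unique = deduplicate(raw)
assert len(unique) == 1 and unique[0]["instances"] == 3
```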
24. Securing 1000 Doors
• Automatic Discovery
– CIDR range assessment, not individual IPs
• /24, /16, etc.
– Automatic Detection of new Hosts
– Automatic Assessment of new Hosts
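Range-based discovery can be sketched with Python's standard ipaddress module. The liveness probe is stubbed out as an injected callable, since the real check (ping, port scan) is outside the scope of this example:

```python
import ipaddress

# Sketch: assess a CIDR range rather than individual IPs, and flag hosts
# that are live but absent from the known inventory. `is_live` is an
# injected callable standing in for a real ping/port-scan probe.
def discover_new_hosts(cidr, known_inventory, is_live):
    """Return live hosts in `cidr` that are not already in `known_inventory`."""
    network = ipaddress.ip_network(cidr)
    known = {ipaddress.ip_address(ip) for ip in known_inventory}
    return [str(ip) for ip in network.hosts() if ip not in known and is_live(ip)]

# Example with a stubbed liveness check (no real network traffic):
live_hosts = {"192.0.2.5", "192.0.2.9"}
new = discover_new_hosts("192.0.2.0/28", {"192.0.2.5"},
                         lambda ip: str(ip) in live_hosts)
assert new == ["192.0.2.9"]  # the newly appeared host gets queued for assessment
```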
26. Pitfall / Explanation / Solution

Pitfall: CSRF Tokens Preventing Crawling
Explanation: Cross-Site Request Forgery (CSRF) tokens need to be resent with every request. If the token is not valid, the application may invalidate the session. Tokens can be embedded in the HTML and not automatically used by the scanner, which results in the scanner not crawling or testing the site adequately.
Solution: Use tools which can be configured to "replay" the appropriate token with the request. Not all tools are capable of this; in some cases multiple tools must be "chained" to satisfy the restriction, or macros need to be written. Tools running a virtual browser also help.

Pitfall: DOM Security Vulnerabilities
Explanation: Client-side security issues which do not generate HTTP requests may go undiscovered when tools only test the application by sending and receiving HTTP requests. DOM (Document Object Model) vulnerabilities may be missed because the tool does not process client-side scripts.
Solution: Tools with virtual-browser capability solve this, as dynamic scripts are processed and tested in the browser. This is also important for systems built on client-side frameworks (Angular, Node.js, etc.) and detects issues such as DOM XSS. Taint analysis of JavaScript code also helps discover client-side security issues.
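Token replay can be sketched with the standard library. The hidden input named csrf_token is a hypothetical field name (real applications vary, and virtual-browser tools handle this step automatically):

```python
import re

# Sketch: extract the per-page CSRF token so the next scanner request
# carries a valid token. The field name "csrf_token" is an assumption.
TOKEN_RE = re.compile(
    r'<input[^>]+name="csrf_token"[^>]+value="([^"]+)"', re.IGNORECASE)

def replay_with_token(page_html, form_data):
    """Merge the freshly issued token from `page_html` into the next request body."""
    match = TOKEN_RE.search(page_html)
    if match is None:
        raise ValueError("no CSRF token found -- session may be invalid")
    return {**form_data, "csrf_token": match.group(1)}

html = '<form><input type="hidden" name="csrf_token" value="abc123"></form>'
body = replay_with_token(html, {"q": "test"})
assert body == {"q": "test", "csrf_token": "abc123"}
```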
27. Pitfall / Explanation / Solution

Pitfall: Dynamically Generated Requests
Explanation: Contemporary applications may dynamically generate HTTP requests via JavaScript functions, and tools which crawl applications to establish site maps may not detect such dynamic links and requests.
Solution: Tools which leverage virtual browsers solve this problem, as the JavaScript is executed just as it would be for a regular user of the application. This results in adequate coverage and detection of dynamic page elements.

Pitfall: Recursive Links – Limiting Repetitive Functionality
Explanation: Applications with recursive links may generate 1000's of unnecessary requests; a calendar control or a search-results function are examples. This can result in 1000's of extra requests being sent to the application with little value yielded. Example: /Item/5/view, /Item/6/view.
Solution: Some tools can limit recursion and request depth, so that when the crawler starts into a link with 1000's of permutations of the same page it stops, saving unnecessary time and resources for both the assessment and the hosting environment servicing it.

Pitfall: SSL/TLS Vulnerabilities
Explanation: Many tools designed to detect cryptographic issues simply do it incorrectly. We have worked with some major tool vendors to assist them with bug fixes in this area.
Solution: Using multiple tools to detect the same issue provides clarity on whether the issue is present or is a false positive.

Pitfall: Non-Standard Protocols
Explanation: Some protocols simply are not handled by certain tools. If protocols such as WebSockets, CORS, AMT, GWTK are not supported, they will not get adequately tested.
Solution: Using multiple tools helps with coverage. The tools chosen to deliver the assessment are based on initial manual enumeration of the target system.

Pitfall: Insufficient Testing Vectors Used
Explanation: All tools test for defined vulnerabilities using a defined set of vectors; other tools also include tests for "known" vulnerabilities. Using one scanning engine may result in inadequate testing due to a restricted list of testing vectors.
Solution: Leveraging multiple tools to test for particular vulnerabilities results in more test cases and a larger set of vectors being used to test for the vulnerability.
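The recursive-link limit could be sketched like this: normalise numeric path segments into a pattern and cap how many URLs per pattern the crawler actually requests. The cap of 10 is an arbitrary illustration:

```python
import re
from collections import Counter

# Sketch: limit recursive/repetitive links (e.g. /Item/5/view, /Item/6/view)
# by normalising numeric path segments into a pattern and capping the
# number of URLs requested per pattern. The cap of 10 is arbitrary.
MAX_PER_PATTERN = 10

def crawl_filter(urls, max_per_pattern=MAX_PER_PATTERN):
    """Yield URLs, skipping further permutations once a pattern is exhausted."""
    seen = Counter()
    for url in urls:
        pattern = re.sub(r"/\d+(?=/|$)", "/{n}", url)  # /Item/5/view -> /Item/{n}/view
        seen[pattern] += 1
        if seen[pattern] <= max_per_pattern:
            yield url

urls = [f"/Item/{i}/view" for i in range(1000)]
kept = list(crawl_filter(urls))
assert len(kept) == 10  # 990 redundant requests are never sent
```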
28. Pitfall / Explanation / Solution

Pitfall: Non-Standard 404
Explanation: Some sites use the standard 404 handler, but many have started to customise error pages to offer a better user experience: a custom 404 that responds with a 200 status. This is the simple case, yet many scanners (still) get caught by it.
Solution: Using tools which can be configured to recognise custom errors is important in order to avoid false positives.

Pitfall: Session Management
Explanation: It is a challenge for any tool to stay logged into an application. The scanner must avoid logout functions, must properly pass along session tokens wherever they happen to be (sometimes in cookies, sometimes on the URL, sometimes in hidden form fields), and must adjust to multiple schemes in play on a single app. The scanner must also properly identify when it has lost its session and then re-login (requiring the automated login process mentioned above) to continue its scan.
Solution: Using multiple tools assists here, as not all tools can be configured to reliably maintain session state. Unreliable session state, or locking out accounts, results in poor coverage and disruption to the engagement.

Pitfall: Ability to Test Web 2.0 (AJAX), Web Services and Mobile
Explanation: Related to several pitfalls above: applications with dynamic API calls via JavaScript, RESTful requests, etc. can go undiscovered and never get invoked at all.
Solution: Using multiple tools configured with REST-awareness avoids missing areas of the application that would otherwise be left untested or require that entire section to be tested by hand.
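Soft-404 detection can be sketched by fingerprinting the response to a deliberately nonexistent path and comparing later responses against it. The 0.9 similarity threshold is an assumed value to tune per site:

```python
from difflib import SequenceMatcher

# Sketch: detect a custom 404 that responds with HTTP 200 ("soft 404").
# First fetch a random, deliberately nonexistent path as a baseline; any
# later page whose body closely matches that baseline is treated as
# "not found" even when the status code says 200.
def is_soft_404(baseline_body, status, body, threshold=0.9):
    """threshold is an assumed value; tune it per target site."""
    if status == 404:
        return True
    similarity = SequenceMatcher(None, baseline_body, body).ratio()
    return similarity >= threshold

baseline = "<html><h1>Oops! Page not found</h1><p>Try our homepage.</p></html>"
real = "<html><h1>Account settings</h1><form>...</form></html>"
assert is_soft_404(baseline, 200, baseline)   # custom error served as 200
assert not is_soft_404(baseline, 200, real)   # genuinely different page
```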
29. Conclusion
• Fullstack Security is important
• Automation is good, but it's never as simple as it looks to get assessment coverage.
• Additional Thoughts…