DevOps puts an intense focus on automation: taking humans out of the loop wherever possible to allow frequent, incremental updates to production systems. Thorough application testing, however, has multiple components. Much of it can be automated, but manual testing is still required. This is inconvenient and not very "DevOps-y," yet it remains an unavoidable requirement in the real world. In addition, managing these multiple sources of application vulnerability intelligence often requires manual interaction: clearing false positives, de-duplicating repeated results, and making decisions about triage and remediation.
Axway has rolled out an application security program that incorporates automated static and dynamic testing, attack surface analysis, and component analysis, as well as inputs from third parties: manual penetration testing, automated and manual dynamic testing, automated and manual static testing, and test results from vendors providing data on their products. Automation has allowed Axway to increase the frequency of web application testing, reducing the cycle time in the application vulnerability "OODA loop." Moving beyond the identification of vulnerabilities, Axway has deployed ThreadFix to automatically aggregate the results of the automated testing and de-duplicate findings. Third-party penetration testers also find vulnerabilities and report them in reasonably structured CSV files, which Axway converts and incorporates into the aggregated vulnerability model in ThreadFix. Centralizing this pipeline allows for metric tracking, both for the application security program as a whole and on a per-vulnerability-source basis. This automation and consolidation now covers 50% of Axway's application vulnerability review process, with plans to extend further.
This presentation walks through Axway's construction of their application security testing pipeline and the decisions they made along the way to maximize the use of automation while accommodating the reality of manual testing requirements. It then looks at how this testing regimen and the associated automation have allowed them to improve deployment practices and collect metrics on their assurance program. Finally, it covers lessons learned along the way, the good and the bad, and identifies targeted next steps Axway plans to take to increase the depth and frequency of application security testing while dealing with the deployment realities required to remain agile and responsive to business requirements.
1. Blending Automated and Manual Testing: Making Application Vulnerability Management Pay Dividends
2. My Background
• Dan Cornell, founder and CTO of Denim Group
• Software developer by background (Java, .NET, etc.)
• OWASP San Antonio
@danielcornell
3. My Background
• Steve Springett, Application Security Architect for Axway
• Software developer by background
• Leader of OWASP Dependency-Track
• Contributor to OWASP Dependency-Check
@stevespringett
5. Standardization
• All projects use the same build system
• All projects built the same way
• Automated onboarding for new projects
• Per-project build expertise not required
12. ThreadFix: Accelerate Software Remediation
ThreadFix is a software vulnerability aggregation and management system that helps organizations aggregate vulnerability data, generate virtual patches, and interact with software defect tracking systems.
13. ThreadFix
• Open source (MPL) application vulnerability management platform
• Create a consolidated view of your applications and vulnerabilities
• Prioritize application risk decisions based on data
• Translate vulnerabilities to developers in the tools they are already using
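At the heart of that consolidated view is matching findings reported by different scanners against the same underlying vulnerability. A minimal sketch of the idea in Python, assuming a simplified finding model keyed on CWE, location, and parameter (ThreadFix's actual matching logic is considerably more sophisticated):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    scanner: str    # tool that reported the finding
    cwe: int        # CWE identifier of the vulnerability class
    url: str        # affected URL or file path
    parameter: str  # affected parameter, if any

def merge_findings(findings):
    """Group findings that describe the same vulnerability.

    Two findings are treated as duplicates when they share the same
    CWE, location, and parameter, regardless of which scanner
    reported them.
    """
    merged = defaultdict(list)
    for f in findings:
        merged[(f.cwe, f.url, f.parameter)].append(f)
    return merged

findings = [
    Finding("ZAP", 79, "/search", "q"),
    Finding("AppScan", 79, "/search", "q"),  # duplicate of the ZAP result
    Finding("ZAP", 89, "/login", "user"),
]
merged = merge_findings(findings)
print(len(merged))  # 2 distinct vulnerabilities from 3 raw findings
```

The de-duplicated keys also give developers one ticket per real vulnerability instead of one per scanner report.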
14. ThreadFix Community Edition
• Main ThreadFix website: www.threadfix.org
– General information, downloads
• ThreadFix GitHub site: www.github.com/denimgroup/threadfix
– Code, issue tracking
• ThreadFix GitHub wiki: https://github.com/denimgroup/threadfix/wiki
– Project documentation
• ThreadFix Google Group: https://groups.google.com/forum/?fromgroups#!forum/threadfix
– Community support, general discussion
16. Access to Vulnerability Data
• Tradeoffs
– The more places the vulnerability data lives, the more likely a compromise
– Withholding information from people who need it makes remediation more challenging
17. Managing All Vulnerability Data
• Manual activities
– Penetration testing
– Code reviews
• 3rd-party data sources
– Customer-performed testing
– External auditor-performed results
18. SSVL and Manual Results
• SSVL data format: https://github.com/owasp/ssvl
• SSVL conversion tool: https://github.com/denimgroup/threadfix/wiki/SSVL-Converter
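Manual results typically arrive as CSV, so the first step is turning each row into a structured vulnerability record. A hedged sketch of that conversion in Python; the CSV column names and XML element names here are hypothetical stand-ins, so consult the SSVL schema linked above for the real structure:

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_vuln_xml(csv_text):
    """Convert a pen tester's CSV export into a simple XML vulnerability
    document of the kind a converter tool could ingest.

    NOTE: column and element names are illustrative, not the actual
    SSVL schema.
    """
    root = ET.Element("Vulnerabilities")
    for row in csv.DictReader(io.StringIO(csv_text)):
        vuln = ET.SubElement(root, "Vulnerability")
        ET.SubElement(vuln, "CWE").text = row["cwe"]
        ET.SubElement(vuln, "Url").text = row["url"]
        ET.SubElement(vuln, "Severity").text = row["severity"]
        ET.SubElement(vuln, "Description").text = row["description"]
    return ET.tostring(root, encoding="unicode")

sample = """cwe,url,severity,description
79,/search,High,Reflected XSS in the q parameter
89,/login,Critical,SQL injection in the user field
"""
print(csv_to_vuln_xml(sample))
```

Once the manual findings are in a structured format, they flow through the same aggregation and de-duplication pipeline as the automated scanner output.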
19. RESTful API to Vulnerability Data
• Custom R&D monitoring dashboard
• Custom dashboards
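A custom dashboard can pull its data straight from ThreadFix's REST API, which authenticates with an API key passed as a query parameter. A minimal sketch in Python; the endpoint path and JSON field names are assumptions based on ThreadFix's documented conventions and should be verified against your version's wiki:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def team_url(base, api_key):
    """Build the URL for a ThreadFix-style team listing endpoint.

    The /rest/teams path and apiKey parameter are assumptions; check
    your ThreadFix instance's API documentation.
    """
    return f"{base}/rest/teams?{urlencode({'apiKey': api_key})}"

def fetch_teams(base, api_key):
    """Fetch the raw team listing JSON from a live ThreadFix server."""
    with urlopen(team_url(base, api_key)) as resp:
        return resp.read().decode("utf-8")

def open_vuln_counts(teams_json):
    """Summarize open vulnerabilities per team from a teams response.

    Field names ("object", "name", "totalVulnCount") are assumed.
    """
    payload = json.loads(teams_json)
    return {t["name"]: t.get("totalVulnCount", 0)
            for t in payload.get("object", [])}

# Illustrative response shape (not captured from a live server):
sample = '{"success": true, "object": [{"name": "R&D", "totalVulnCount": 12}]}'
print(open_vuln_counts(sample))  # {'R&D': 12}
```

Keeping the parsing separate from the HTTP call makes the summarization logic easy to test without a running ThreadFix instance.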
20. Key Performance Indicators
• Don't go overboard
– Use only what is needed
• Progress and velocity
• Per-team comparison
• Min/max/avg time to close per severity
• By CWE
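The time-to-close KPI falls out of the aggregated data directly. A small sketch in Python, assuming each vulnerability record carries a severity, an opened date, and a closed date (still-open findings are excluded from the statistics):

```python
from collections import defaultdict
from datetime import date
from statistics import mean

def time_to_close_stats(vulns):
    """Compute min/max/avg days-to-close per severity.

    Each vuln is a (severity, opened, closed) tuple; vulnerabilities
    that are still open (closed is None) are skipped.
    """
    by_sev = defaultdict(list)
    for severity, opened, closed in vulns:
        if closed is not None:
            by_sev[severity].append((closed - opened).days)
    return {
        sev: {"min": min(days), "max": max(days), "avg": mean(days)}
        for sev, days in by_sev.items()
    }

vulns = [
    ("High", date(2015, 1, 1), date(2015, 1, 11)),   # closed in 10 days
    ("High", date(2015, 2, 1), date(2015, 2, 21)),   # closed in 20 days
    ("Low",  date(2015, 1, 1), None),                # still open, excluded
]
print(time_to_close_stats(vulns))  # High: min 10, max 20, avg 15
```

The same grouping key swaps out trivially: group by team for per-team comparison, or by CWE for the by-CWE breakdown.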
22. Lessons Learned
• Provide as much visibility as possible
– Varying degrees of detail
– Multiple delivery vehicles
• Set clear pass/fail criteria for security bars
– Provide a custom dashboard showing status and advance warning
23. Additional Advice
• Automation is not better than manual
– It's faster and more efficient
– Both are necessary
• Don't forget manual assessments
– Threat modeling
– Secure design/architecture and code review
– Penetration testing
24. Finally
• Vulnerabilities in CI/CD/CS infrastructure
– Threat model
– Secure architecture review
– Patch management
– Configuration management
– Key management
– Always use TLS