Security is often a frustrating field for business and IT decision makers. It can be difficult to quantify, difficult to gain visibility into, and difficult to know when you have “enough”. Do you really need that latest threat-feed subscription or state-of-the-art malware protection device? Do you need to add another security analyst to your team? And if so, how can you articulate, in business terms, the value these investments bring? This session will explore practical methods for applying metrics in security to support business decision making, and provide a framework for implementing straightforward security metrics, whether inside your walls or at a service provider.
1. Practical Measures for Measuring Security
2. WELCOME TO SECURE360 2012
Did you remember to scan your badge for CPE
Credits? Ask your Room Volunteer for
assistance.
Please complete the Session Survey front and
back (this is Room 7), and leave on your seat.
Note: “Session” is Tuesday or Wednesday
Are you tweeting? #Sec360
3. AGENDA
Are you Ready?
The Problem of Measuring Security
Metric Myths
Characteristics of Effective Metrics
Defining Your Metrics
The Process of Measurement
Sample Metrics
Implementing Metrics
Presenting Metrics
A Mature Metrics Program
4. WHY HAVEN’T YOU SOLVED THIS YET?
Is the Organization ready?
What’s the Tone from the Top?
Is Security someone’s Job?
Do you have Policy in place?
Are resources allocated to identify and detect issues?
Are resources allocated to remediate issues?
Are you Level 4?
6. TYPICAL PROBLEMS OF MEASURING SECURITY
Risk is difficult to define precisely
Attack Surface
Current Environment
Asset Value
Measures not linked to action
Measures often focus on outcomes
7. METRIC MYTHS
7 Myths that hold people back 92.467% of the time.
1. Metrics must be Objective and Tangible
2. Metrics must have discrete values
3. Metrics must be absolute
4. Metrics are costly
5. You can’t manage what you can’t measure
6. It’s essential to measure outcomes
7. You need precise, accurate data
8. CHARACTERISTICS OF A GOOD METRIC
(This is probably NOT a good example)
Attackability Computation.
An Attack Surface Metric, Carnegie Mellon University, 2005
9. CHARACTERISTICS OF A GOOD METRIC
1. Directly relates to an objective
2. Should have a logical stakeholder
3. Collection should be inexpensive, simple and standardized
4. Should have a resolution appropriate for maturity
5. Should be phase appropriate
6. Should have applicability defined
7. Should have an indicated action
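As a sketch (not from the slides), the seven characteristics above can be captured as a simple metric-definition record. All field values below are illustrative examples, not prescribed content:

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    """Illustrative record capturing the seven characteristics of a good metric."""
    name: str
    objective: str          # 1. the objective the metric directly relates to
    stakeholder: str        # 2. the logical stakeholder who acts on it
    collection_method: str  # 3. inexpensive, simple, standardized collection
    resolution: str         # 4. resolution appropriate for program maturity
    phase: str              # 5. program phase where the metric applies
    applicability: str      # 6. defined scope: which systems it covers
    indicated_action: str   # 7. what to do when the metric moves

# Hypothetical example definition
patch_sla = MetricDefinition(
    name="High-severity patch SLA compliance",
    objective="Reduce window of exposure to known vulnerabilities",
    stakeholder="IT Operations Manager",
    collection_method="Monthly export from vulnerability scanner",
    resolution="Percentage, reported monthly",
    phase="Implementation",
    applicability="All production systems",
    indicated_action="Below target: add patching resources or revisit SLA",
)
print(patch_sla.name)
```

Writing each candidate metric into a record like this makes gaps obvious: a metric with no stakeholder or no indicated action usually isn’t worth collecting.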
11. DEVELOPING YOUR METRICS
Metrics Relating to Security Controls
1. Should map directly to a defined control
2. Use data describing the security control’s implementation to generate required measures
3. Characterize the measure as applicable to system categorization (low, med, high)
12. DEVELOPING YOUR METRICS
Metrics Relating to Security Program Performance
1. Map to InfoSec Goals &amp; Objectives that encompass performance
2. Use the data describing the information security program performance to generate required measures
13. NOW THAT YOU HAVE YOUR METRICS
On your Mark, get Set…
Document in a standard format
See NIST SP 800-55 for an excellent template
Prioritize and Select
Establish Performance Targets
Evaluate Metric performance and relevance periodically, incorporate feedback
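One way the “establish performance targets, then evaluate periodically” steps above might look in practice, as a sketch; the threshold scheme, target, and monthly values are all invented for illustration:

```python
def evaluate_metric(value: float, target: float, warn_margin: float = 0.1) -> str:
    """Classify a 'higher is better' percentage metric against its target.

    Returns 'green' at/above target, 'yellow' within warn_margin below it,
    'red' otherwise.
    """
    if value >= target:
        return "green"
    if value >= target * (1 - warn_margin):
        return "yellow"
    return "red"

# Hypothetical monthly values for one metric, evaluated against a 90% target
history = [78.0, 85.0, 92.0]
target = 90.0
statuses = [evaluate_metric(v, target) for v in history]
print(statuses)  # ['red', 'yellow', 'green']
```

The point is not the specific thresholds but that the target and the evaluation rule are written down with the metric, so the periodic review is mechanical rather than a debate.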
14. SAMPLE METRICS
• Percentage of the agency’s information system budget devoted to information security
• Percentage of “high” vulnerabilities mitigated within defined time periods after discovery
• Percentage of remote access points used to gain unauthorized access
• Percentage of information system security personnel that have received security training
• Average frequency of audit records review and analysis for inappropriate activity
15. SAMPLE METRICS (CONTINUED)
• Percentage of new systems that have completed certification and accreditation (C&amp;A) prior to their implementation
• Percentage of approved and implemented configuration changes identified in the latest automated baseline configuration
• Percentage of information systems that have conducted annual contingency plan testing
• Percentage of users with access to shared accounts
16. SAMPLE METRICS (CONTINUED)
• Percentage of incidents reported within required time frame per applicable incident category
• Percentage of system components that undergo maintenance in accordance with formal maintenance schedules
• Percentage of media that passes sanitization procedures
• Percentage of physical security incidents allowing unauthorized entry into facilities containing information systems
17. SAMPLE METRICS (CONTINUED)
• Percentage of employees who are authorized to access information systems only after they sign an acknowledgement that they have read and understood rules of behavior
• Percentage of individuals screened before being granted access to organizational information and information systems
• Percentage of vulnerabilities remediated within organization-specified time frames
18. SAMPLE METRICS (CONTINUED)
• Percentage of system and service acquisition contracts that include security requirements and/or specifications
• Percentage of mobile devices that meet approved cryptographic policies
• Percentage of operating system vulnerabilities for which patches have been applied or that have been otherwise mitigated
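Most of the sample metrics above reduce to “count the records that meet a condition, divide by the population.” As a hedged sketch, here is one of them — percentage of “high” vulnerabilities mitigated within a defined time period after discovery — computed from hypothetical scanner records (the data and the 30-day SLA are made up):

```python
from datetime import date

# Each record: (severity, discovered, mitigated_or_None); values are illustrative.
vulns = [
    ("high", date(2012, 4, 1), date(2012, 4, 10)),
    ("high", date(2012, 4, 3), date(2012, 5, 20)),
    ("high", date(2012, 4, 5), None),            # still open
    ("low",  date(2012, 4, 2), date(2012, 4, 4)),
]

def pct_high_mitigated_within(vulns, sla_days: int) -> float:
    """Percentage of 'high' vulns mitigated within sla_days of discovery."""
    high = [v for v in vulns if v[0] == "high"]
    if not high:
        return 100.0  # vacuously compliant when there are no high vulns
    on_time = sum(
        1 for _, found, fixed in high
        if fixed is not None and (fixed - found).days <= sla_days
    )
    return 100.0 * on_time / len(high)

print(pct_high_mitigated_within(vulns, sla_days=30))  # ~33.3: 1 of 3 on time
```

Note that open vulnerabilities count against the metric rather than being excluded — a common design choice that keeps the number from looking better as remediation stalls.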
22. WHEN YOU GET BACK TO THE OFFICE ON MONDAY:
1. Are you ready?
2. Engage Stakeholders
3. Identify Your Metrics
- Leverage CIS, NIST 800-55
4. Automate collection & reporting
5. Act on what you find
6. Make it look good!
7. Document the value
8. Re-evaluate periodically
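Step 4 above (“Automate collection &amp; reporting”) can start very small. A minimal sketch, assuming collected values already exist in memory; the metric names and numbers are invented, and a real version would pull from your scanner, LMS, or ticketing system:

```python
import csv
import io

# Hypothetical collected values: metric name -> (value, target)
collected = {
    "% high vulns patched within SLA": (87.5, 90.0),
    "% staff completing security training": (96.0, 95.0),
}

# Write a simple periodic report; a file path would replace StringIO in practice.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["metric", "value", "target", "meets_target"])
for name, (value, target) in collected.items():
    writer.writerow([name, value, target, value >= target])
report = buf.getvalue()
print(report)
```

Even a flat CSV like this, generated on a schedule, beats hand-assembled numbers: it is repeatable, auditable, and easy to feed into a dashboard tool later.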
23. REFERENCES / CREDITS
CMMI: http://www.sei.cmu.edu/cmmi/
NoticeBored security metrics: http://www.noticebored.com/html/metrics.html
Center for Internet Security Consensus Security Metrics: http://benchmarks.cisecurity.org/en-us/?route=downloads.metrics
NIST SP 800-55 Rev. 1: http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1.pdf
Geckoboard (dashboards): http://www.geckoboard.com/
1. It’s OK to measure subjective factors, such as “security awareness”, as long as you don’t measure them subjectively.
2. It’s easy to measure the number of people who attended security awareness training, but much harder to measure the effectiveness of that training. That doesn’t mean it’s impossible or not worthwhile, though: survey and statistical theory can be applied to extract very useful information, especially when applied to a continuous scale.
3. The number of security incidents this month, or the number of vulnerabilities patched, are absolute numbers, but without a good deal of context — such as the total number of vulnerabilities, the size of the environment, and the number of staff available to patch — it is very hard to understand the real meaning behind them. Surveying your staff to ask “Is security better or worse this month versus last month?” is probably a much more telling number, measured over time, even though it has no absolute value.
4. Measurement can be costly, but cost should be a function of your metric design. We’ll talk more about this in the next section.
5. This is related to all of the previous myths. You can absolutely improve security and reduce risk without being able to measure.
6. Just because your house did not burn down this month doesn’t mean that no one piled a stack of oily rags next to the gasoline can in the garage. With real-world security, measuring outcomes is like talking about the need to keep the barn door closed after the horses are gone. Outcomes can be useful, however, so don’t throw them out completely.
7. A recent survey showed that 87.663% of security metrics were a load of hooey. The more decimal places you see, the more suspicious you should be.
Show vuln scan results. So what? Actions? Resources required? Business case?