2. Text Books & References
TEXT BOOKS:
T1: Information Security Management – A Student's Handbook – NASSCOM
T2: Assessing Information Security (Strategies, Tactics, Logic and Framework) by A.
Vladimirov, K. Gavrilenko, and A. Michajlowski.
T3: "The Art of Computer Virus Research and Defense" by Peter Szor.
WEB REFERENCES:
❏ https://www.sans.org/reading-room/whitepapers/threats/implementing-vulnerability-management-process-34180
❏ http://csrc.nist.gov/publications/nistpubs/800-40-Ver2/SP800-40v2.pdf
❏ http://www.iso.org/iso/home/standards/management-standards/iso27001.htm
❏ http://csrc.nist.gov/publications/nistpubs/800-55-Rev1/SP800-55-rev1.pdf
3. Information Security Audit
An information security audit is an audit on the level of information security in an
organization.
Within the broad scope of auditing information security there are multiple types of
audits, multiple objectives for different audits, etc.
Most commonly the controls being audited can be categorized to technical, physical
and administrative.
Auditing information security covers topics from auditing the physical security of
data centers to auditing the logical security of databases and highlights key
components to look for and different methods for auditing these areas.
4. 6.1. Introduction - Security Metrics
6.2. Types of Security Metrics
6.3. Using Security Metrics
6.4. Developing the Metrics Process
6.5. Metrics and Reporting
6.6. Designing Information Security Measuring Systems
Lecture Plan
5. 1. Introduction- Security Metrics
❏In the face of high-profile news reports of serious security breaches,
managers are more than ever being held accountable for demonstrating the
effectiveness of their security programs.
6. Why do we care?
How do we know how “secure” an organization is?
Metrics help define “secure”
Metrics let us benchmark our security investments against other organizations
The metrics “gathering” process often leads to identification of security
inconsistencies or holes
7. Why do we care: Example
Manager asks, “Are we secure?”
Without metrics:
“Well that depends on how you look at it.”
With metrics:
“No doubt about it. Look at our risk score before we implemented that firewall
project. It’s down 10 points. We are definitely more secure today than we were
before.”
8. Why do we care: Example
Manager Asks: “Have the changes that we implemented improved our security
posture?”
Without metrics:
“Sure. They must have, right?”
With metrics:
“Absolutely. Look at our risk score before we made the recommended
changes, and now it’s down 25 points. No question, the changes reduced our
security risk.”
9. Motorola CISO on Metrics
“Security experts can't measure their success without security metrics, and
what can't be measured can't be effectively managed.”
(William Boni, President CISO, Motorola Inc. www.secmet.org)
10. What means should managers be using to meet this challenge?
❏Key among these should be security metrics.
❏It helps to understand what metrics are by drawing a distinction between
metrics and measurements.
11. Difference Between Metrics and Measurements
Measurements:
❏ Measurements provide single-point-in-time views of specific, discrete
factors.
❏ They are generated by counting.
❏ They are raw data.
Metrics:
❏ Metrics are derived by comparing two or more measurements, taken over
time, against a predetermined baseline.
❏ They are generated from analysis.
❏ They are objective or subjective human interpretations of those data.
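The distinction can be sketched in code. Below, the monthly counts are measurements (raw data from counting), and the percentage change against a baseline is a metric (derived from analysis). All names and figures are hypothetical, not from the text:

```python
# Measurements: monthly counts of unpatched hosts (raw data, from counting).
measurements = {"Jan": 120, "Feb": 95, "Mar": 60}

# Metric: percentage change against a predetermined baseline (from analysis).
baseline = measurements["Jan"]
metric = {month: round(100 * (count - baseline) / baseline, 1)
          for month, count in measurements.items()}

print(metric)  # {'Jan': 0.0, 'Feb': -20.8, 'Mar': -50.0}
```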
12. Good Metrics are those that are…… SMART
Specific
Measurable
Attainable
Repeatable
Time Dependent
13. Truly useful metrics
These indicate the degree to which security goals, such as data
confidentiality, are being met, and they drive actions taken to improve an
organization's overall security program.
14. Common Issues in generating metrics
The number of security attacks an organization has experienced is not
necessarily an indication of how secure that organization is. A security
manager needs to look beyond the organization’s security incident record for
indicators of security strength.
15. Critical elements in generating metrics
Asset value, threat and vulnerability are critical elements of overall risk and
are weighed in most decisions having to do with security
16. How to measure them?
Asset value – the easiest to measure. But can we measure the reputation of a
company?
Threat – cannot be measured directly.
Vulnerability – the Center for Internet Security has established benchmarks
and developed automated tools to measure vulnerabilities.
17. Categorizing metrics – By NIST
❏NIST – National Institute of Standards and Technology
❏The Performance Measurement Guide for Information Security divides
security metrics into three categories and links each to a level of security
program maturity.
21. IMPACT
Metrics used to convey the impact of the information security program on the
institution's mission, often through quantifying cost avoidance or risk
reduction produced by the overall security program.
22. 2. TYPES OF SECURITY METRICS
According to Level, three distinct types of metrics
❏Strategic Security Metrics
❏Security Management Metrics
❏Operational Security Metrics
23. Strategic Security Metrics
❏These are measures, concerning the information security elements of
high level business goals, objectives and strategies.
24. Strategic Security Metrics- Example
If the organization needs to bolster
its information security capabilities and competences
in order to support various business initiatives,
without expanding the budget, metrics concerning the efficiency and
effectiveness of information security are probably relevant.
25. ❏Broad-brush metrics relating to information security risks, capabilities and
value tend to exist at Strategic security metrics level.
❏The reporting period may be one or more years.
26. Security Management Metrics
There are numerous facets to managing information security risks that could
be measured, hence many possible metrics.
27. Special effort is needed to identify metrics that....
❏directly relate to achieving specific business objectives for
information security.
❏are needed to manage the information security department,
function or team like any other part of the business.
❏Example: expenditure against budget
29. Examples
❏Metrics concerning controls within the information security management
system.
Example: implementing dual-factor authentication
❏Metrics concerning the information security management system itself.
Example: security incident statistics
30. Operational Security Metrics
❏Most information security controls, systems and processes need to be
measured in order to operate and control them.
❏Normally these metrics are only of direct concern to those managing and
performing security activities.
31. Managing and performing security activities include....
❏Technical security metrics
❏Non-technical security metrics
These are often updated on a weekly, daily or hourly basis.
32. Classification By Object of measurement
❏Process Security Metrics
❏Network Security Metrics
❏Software Security Metrics
❏People Security Metrics
33. Process Security Metrics:
❏Measure processes and procedures
❏Usually compliance/governance driven
❏Generally support better security, but the actual impact is hard to define.
34. Process Security Metrics- Examples
❏Number of policy violations
❏Percentage of systems with formal risk assessments
❏Percentage of system with tested security controls
❏Percentage of weak (noncompliant) passwords
❏Number of identified risks and their severity
❏Percentage of systems with contingency plans etc….
35. Network Security Metrics:
❏These are driven by products (firewalls, IDS, etc.). Readily available and
widely used, they give a sense of control.
❏They usually have a level of data presentation through charts and interfaces.
❏They can be misleading, though.
36. Network Security Metrics- Examples
❏Successful/Unsuccessful logons
❏Number of incidents
❏Number of viruses blocked
❏Number of patches applied
❏Number of spam blocked
❏Number of virus infections
❏Number of port probes
❏Traffic analysis etc….
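Several of the metrics above, such as successful/unsuccessful logons, can be derived from raw log data. A minimal sketch; the log lines and their format are hypothetical:

```python
# Hypothetical authentication log lines (format invented for illustration).
log_lines = [
    "2024-05-01 09:12:01 LOGIN alice SUCCESS",
    "2024-05-01 09:12:40 LOGIN mallory FAILURE",
    "2024-05-01 09:13:02 LOGIN bob SUCCESS",
    "2024-05-01 09:14:11 LOGIN mallory FAILURE",
]

# Measurement by counting: tally outcomes of logon attempts.
successful = sum(1 for line in log_lines if line.endswith("SUCCESS"))
unsuccessful = sum(1 for line in log_lines if line.endswith("FAILURE"))

print(successful, unsuccessful)  # 2 2
```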
37. Software Security Metrics:
❏These are usually troublesome (LOC, function points, complexity, etc.):
metrics that are context-sensitive, environment-dependent and
architecture-dependent.
38. Software Security Metrics-Examples
❏Size and complexity
❏defects/LOC
❏defects (severity, type) over time
❏cost per defect
❏attack surface (# of interfaces)
❏layers of security and design flaws
39. People Security Metrics:
❏Usually relevant, but unreliable (as people's behavior is difficult to model).
❏Biases and non-standard responses make these metrics difficult to
predict.
41. Sample list of Metrics- Business Functions
❏ APPLICATION SECURITY
❏ Number of Applications
❏ Percentage of Critical Applications
❏ Risk Assessment Coverage
❏ Security Testing Coverage
42. ❏ CONFIGURATION CHANGE MANAGEMENT
❏ Mean-Time to Complete Changes
❏ Percent of Changes with Security Review
❏ Percent of Changes with Security Exceptions
43. ❏ FINANCIAL
❏ Information Security Budget as % of IT Budget
❏ Information Security Budget Allocation
44. ❏ INCIDENT MANAGEMENT
❏ Mean-Time to Incident Discovery
❏ Incident Rate
❏ Percentage of Incidents Detected by Internal Controls
❏ Mean-Time Between Security Incidents
❏ Mean-Time to Recovery
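The mean-time metrics above are simple averages over incident timestamps. A minimal sketch with hypothetical incident data (the timestamps and structure are illustrative only):

```python
from datetime import datetime

# Each hypothetical incident: (occurred, discovered, recovered) timestamps.
incidents = [
    (datetime(2024, 1, 1, 8, 0), datetime(2024, 1, 1, 10, 0), datetime(2024, 1, 1, 14, 0)),
    (datetime(2024, 2, 1, 9, 0), datetime(2024, 2, 1, 9, 30), datetime(2024, 2, 1, 12, 30)),
]

def mean_hours(deltas):
    """Average a list of timedeltas, expressed in hours."""
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 3600

# Mean-Time to Incident Discovery: occurrence -> discovery.
mttd = mean_hours([disc - occ for occ, disc, rec in incidents])
# Mean-Time to Recovery: discovery -> recovery.
mttr = mean_hours([rec - disc for occ, disc, rec in incidents])

print(mttd, mttr)  # 1.25 3.5
```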
46. ❏ VULNERABILITY MANAGEMENT
❏ Vulnerability Scan Coverage
❏ Percent of Systems Without Known Severe Vulnerabilities
❏ Mean-Time to Mitigate Vulnerabilities
❏ Number of Known Vulnerability Instances
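The coverage and percentage metrics above reduce to simple ratios over the system inventory. A sketch with hypothetical counts:

```python
# Hypothetical inventory figures for a vulnerability management program.
total_systems = 200
systems_scanned = 180
systems_with_severe_vulns = 27

# Vulnerability Scan Coverage: share of inventory actually scanned.
scan_coverage = 100 * systems_scanned / total_systems
# Percent of (scanned) systems without known severe vulnerabilities.
pct_without_severe = 100 * (systems_scanned - systems_with_severe_vulns) / systems_scanned

print(scan_coverage)       # 90.0
print(pct_without_severe)  # 85.0
```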
47. 3. Using Security Metrics
❏It involves data acquisition.
❏Data may be collected automatically or manually.
❏ Data collection automation depends on the availability of data from
automated sources versus the availability of data from people.
❏ Manual data collection involves developing questionnaires and
conducting interviews and surveys with the organization’s staff.
48. ❏More useful data becomes available from
❏ Semi-automated
and
❏ Automated data sources
Such as self-assessment tools, certification and accreditation (C&A) databases,
incident reporting and response databases, and other data sources as a
security program matures.
49. ❏Metrics data collection is fully automated when all data is gathered by
using automated data sources without human involvement or
intervention.
50. 4. Development of the Metrics Process
❏Regardless of the underlying framework, there are seven key steps to guide
the process of establishing a security metrics program.
51. 1. Define the metrics program goals and objectives
2. Decide which metrics to generate
3. Develop strategies for generating the metrics
4. Establish benchmarks and targets
5. Determine how the metrics will be reported
6. Create an action plan and act on it
7. Establish a formal program review/refinement cycle
52. This seven-step methodology should yield a firm understanding of the
purpose of the security metrics program, its specific deliverables, and how, by
whom, and when these deliverables will be provided.
53. Step 1
Define the metrics program goals and objectives
Why?
❏Because developing and maintaining a security metrics program can take
considerable effort and divert resources away from other security
activities, it is critical that the goals and objectives of the program be
well-defined and agreed upon up front.
54. How?
❏A single goal that clearly states the end toward which all measurement
and metrics gathering efforts should be directed is a good approach.
55. Provide metrics that clearly and simply communicate how efficiently and
effectively our company is balancing security risks and preventive measures,
so that investments in our security program can be appropriately sized and
targeted to meet our overall security objectives.
56. A few objectives for the goal above, for example, might be:
a) To base the security metrics program on process improvement best
practices within our company.
b) To leverage any relevant measurements currently being collected.
c) To communicate metrics in formats custom-tailored to various audiences.
d) To involve stakeholders in determining what metrics to produce.
57. Step 2
Decide which metrics to generate
❏Two approaches can be used to determine which metrics are needed:
❏ Top-down approach
❏ Bottom-up approach
58. Top-Down Approach
It starts with the objectives of the security program,
and then works backward to identify specific metrics that would
help determine if those objectives are being met,
and lastly measurements needed to generate those metrics.
60. Bottom-Up Approach
The bottom-up approach entails first defining which security processes,
products, services, etc. are in place that can be or already are measured,
then considering which meaningful metrics could be derived from
those measurements,
and finally assessing how well those metrics link to objectives for the
overall security program.
62. ❏The top-down approach will more readily identify the metrics that should
be in place given the objectives of the overall security program
❏The bottom-up approach yields the most easily obtainable metrics.
63. Both approaches assume that overall security program objectives have
already been established. If they have not been, defining these high-level
objectives is obviously important and a prerequisite.
64. Step 3
Develop Strategies for Generating the Metrics
❏Strategies for collecting needed data and deriving the metrics must be
developed.
❏These strategies should specify
❏ the source of the data
❏ the frequency of data collection
❏ who is responsible for raw data accuracy
❏ data compilation into measurements
❏ generation of the metric.
66. ❏Early on, there were few automated tools available to make data
collection, analysis, and reporting cost-effective.
❏But in recent years, products have been introduced into the marketplace
to make these activities more viable.
67. Step 4
Establish benchmarks and targets
❏Appropriate benchmarks should be identified and improvement targets
set.
❏Benchmarking is the process of comparing one’s own performance and
practices against peers within the industry or noted “best practice”
organizations outside the industry.
68. ❏Not only does this process provide fresh ideas for managing an activity,
but it can also provide the comparative data needed to make metrics more
meaningful.
❏Benchmarks also help establish achievable targets for driving
improvements in existing practices.
69. Step 5
Determine how the metrics will be reported
❏No security metrics efforts are worthwhile if the results are not effectively
communicated.
70. Step 6
Create an action plan and act on it
❏The action plan should contain all tasks that need to be accomplished to
launch the security metrics program, along with expected completion
dates and assignments.
❏Action items should be directly derivable from the objectives.
71. Step 7
Establish a formal program review/refinement cycle
❏Formal, regular reexamination of the entire security metrics program
should be built into the overall process.
72. Review Process..
❏Is there reason to doubt the accuracy of any of the metrics?
❏Are the metrics useful in determining new courses of action for the
overall security program?
❏How much effort is it taking to generate the metrics?
❏Is the value derived worth that effort?
These and other questions like them will be important to
answer during the review process.
73. A fresh scan of security metrics standards and best practices within and
outside the industry should also be conducted to help identify new
developments and opportunities to fine-tune the program.
74. Conclusion::
❏The seven-step methodology can guide development of very simple
metrics programs, as well as highly ambitious ones.
❏The important thing to keep in mind is that the metrics generated should
be useful enough
❏ to drive improvement in the overall security program and
❏ to help prove the value of that program to the organization as a
whole.
75. 5. Metrics and Reporting
❏The frequency of reports depends on
❏ organizational norms
❏ the volume
❏ gravity of information available and
❏ management requirements.
77. The latter are more likely to identify and discuss trends and strategic
issues, and to include status reports on security-relevant development
projects, information security initiatives and so forth; in other words, they
provide the context to make sense of the numbers.
78. Here are some options for your consideration
An annual, highly-confidential information security report for the CEO, the
board and other senior management (including internal audit). This report
might include commentary on the success or otherwise of specific security
investments.
A forward-looking section can help to set the scene for planned future
investments, and is a good opportunity to point out the ever-changing legal
and regulatory environment and the corresponding personal liabilities of
senior managers.
79. Quarterly status reports to the most senior body directly responsible for
information security, physical security, risk and/or governance.
Traffic light status reports are common and KPIs (Key Performance Indicators)
may be required, but the information security manager’s commentary
(supplemented or endorsed by that of the CTO/CIO) is a good value add.
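A traffic light status report simply maps each KPI reading onto green/amber/red bands. A minimal sketch; the function name and thresholds are hypothetical, not prescribed by any standard:

```python
def traffic_light(value, green_at, amber_at, higher_is_better=True):
    """Classify a KPI reading as 'green', 'amber' or 'red'."""
    if not higher_is_better:
        # Flip the scale so the same comparisons work for lower-is-better KPIs.
        value, green_at, amber_at = -value, -green_at, -amber_at
    if value >= green_at:
        return "green"
    if value >= amber_at:
        return "amber"
    return "red"

# Hypothetical patch-coverage KPI: >= 95% green, >= 85% amber, below that red.
print(traffic_light(97, 95, 85))  # green
print(traffic_light(90, 95, 85))  # amber
print(traffic_light(70, 95, 85))  # red
```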
80. Monthly reports to the CTO/CIO, listing projects participated in and security
incidents, along with their monetary value (the financial impacts do not need
to be precise; they are used to indicate the scale of losses).
81. 6. Designing information security measurement systems
In order to design an information security measurement system one has to
ask some fundamental questions.
❏What are we going to measure?
❏How will we measure things?
❏How will we report?
❏How should we implement our reporting system?
❏Setting targets.
82. 1. What are we going to measure?
❏Identify the right metrics: we shouldn’t implement a measurement
process if we don’t intend to follow it routinely and systematically.
❏We need repeatable and reliable measures.
❏We shouldn’t capture data that we don’t intend to analyse, that is simply
an avoidable cost.
❏We shouldn’t analyse data if we don’t intend to make practical use of the
results.
83. 2. How will we measure things?
❏Where will the data come from and where will they be stored?
❏If the source information is not already captured and available, there will
be a need to put in place the processes to gather it.
❏This in turn raises the issue of who will capture the data.
❏Will it be centralized or will we distribute the data collection processes?
❏If departments and functions outside central control are reporting, how
far can they be trusted not to manipulate the figures?
❏Will they meet deadlines and formatting requirements?
❏How much data gathering and reporting can be automated?
84. 3. How will we report?
❏What do senior management actually want?
❏To get senior management buy-in it is important to discuss the purpose
and outputs with managers and peers.
❏Provide alternative formats initially to assess their preference.
❏It may be required to report differently from other functions in the
organization, using different presentation formats as well as different
content.
❏Managers are likely to feel more comfortable with conventional
management reports, so look at a range of sample reports to pick out the
style cues.
85. 4. How should we implement our reporting system?
❏When developing metrics, it’s worth testing out the feasibility and
effectiveness of the measurement processes and the usefulness of
chosen metrics on a limited scale before rolling them out across the
entire corporation. Pilot studies or trials are useful ways to iron-out any
glitches in the processes for collecting and analysing metrics, and for
deciding whether the metrics are truly indicative of what you are trying to
measure.
❏Even after the initial trial period, continuous feedback on the metrics can
help to refine the measurement system. Changes in both the organization
and the information security risks it faces mean that some metrics are
likely to become outdated over time.
86. 5. Setting Targets
❏Measuring and reporting leads to the identification and benchmarking of
Key Performance Indicators (KPIs) and then tracking measures to evaluate
performance.
❏Before publishing the chosen metrics it is important to figure out which
ones would truly indicate making progress towards the organization’s
information security goals.
87. What is IT Audit (informal)
• Say what you do
• Do what you say
• Evidence
88. Introduction to Security Audit: Servers and Storage devices,
Infrastructure and Networks, Communication Routes:
1. Information Systems Audit versus Information Security Audit:
• Information Systems Audit: a large, broad term that encompasses
demarcation of responsibilities, server and equipment management,
problem and incident management, network division, safety, security and
privacy assurance, etc. Information systems audit is the broader term and
includes information security audit.
• Information Security Audit: has a one-point agenda, and that is the security
of data and information while it is in storage and in transmission.
89. What is an Information Security Audit?
• A security audit is a systematic evaluation of the security of a company's
information system by measuring how well it conforms to a set of
established criteria.
• An audit typically assesses the security of the system's physical
configuration and environment, software, information handling processes,
and user practices.
• Security audits are often used to determine compliance with regulations
that specify how organizations must deal with information.
90. Some of the purposes of audits are listed below:
a) Building awareness of current practices and risks
b) Reducing risk by evaluating, planning and supplementing security efforts
c) Strengthening controls, both automated and human
d) Complying with customer and regulatory requirements and expectations
e) Building awareness and interaction between technology and business
teams
f) Improving overall IT governance in the organization
91. There are three main types of security diagnostics:
Security Audits
Vulnerability Assessments
Penetration Testing
92. • Security Audits: measure an information system's performance against a
list of criteria.
• Vulnerability Assessment: on the other hand, involves a comprehensive
study of an entire information system, seeking potential security
weaknesses.
• Penetration Testing: is a covert operation, in which a security expert tries
a number of attacks to ascertain whether or not a system could withstand
the same types of attacks from a malicious hacker
93. Scope of the Audit
• As with any Audit, a risk assessment should be one of the first steps to be
completed when examining a new process.
• The risk assessment will help determine whether the process warrants
expending a significant amount of audit resources on the project.
• The scope of the audit depends on the risk.
• But even for the high-risk systems, the scope should be limited to testing
the critical internal controls upon which the security of the process
depends.
94. The scope of the audit depends upon:
a. Site business plan
b. Type of data assets to be protected
c. Value of importance of the data and relative priority
d. Previous security incidents
e. Time available
f. Auditors experience and expertise
95. What should be covered in audits? (Given just for reference only)
96. What makes a good security audit?
Per the IS Auditing Standards of the Information Systems Audit and Control Association
(ISACA), a good security audit will likely include the following:
Clearly defined objectives
Comprehensive coverage of security.
An audit team that is experienced, independent and objective (“two-person rule”).
Important IS audit meetings, such as the opening and closing meetings as well as the
interviews, conducted as a team.
Ensuring that actual operations in the organization are not significantly
disrupted by the audit.
Appropriate communication and appointment of a central point of contact and other
support for the auditors.
Execution planned and carried out in a phased manner.
97. Constraints of a security audit:
Time constraints
Third party access constraints
Business operations continuity constraints
Scope of audit engagement
Technology tools constraints
98. Information Security Methodologies (Black-box, White-box, Grey-box)
Need for a Methodology
• Audits need to be planned and have a certain methodology to cover the
total material risks of an organization.
• A planned methodology is also important as this clarifies the way forward
to all in the organization and the audit teams.
• Which methodology and techniques are used is less important than having
all the participants within the audit approach the subject in the same
manner.
99. Audit methodologies:
Audit methods can be classified according to the type of activity. These include
three types:
A. Testing – Pen tests and other testing methodologies are used to explore
vulnerabilities. In other words, exercising one or more assessment objects to
compare actual and expected behaviors.
B. Examination and Review – This includes reviewing policies, processes, logs,
other documents, practices, briefings, situation handling, etc. In other words,
checking, inspecting, reviewing, observing, studying, or analyzing assessment
objects.
C. Interviews and Discussion – This involves group discussions, individual
interviews, etc.
100. Auditing techniques:
There are various Auditing techniques used:
• Examination Techniques
• Target Identification and Analysis Techniques
• Target Vulnerability Validation Techniques
101. 1. Examination Techniques
Examination techniques are generally conducted manually to evaluate systems,
applications, networks, policies, and procedures and to discover vulnerabilities.
These techniques include:
• Documentation review
• Log review
• Ruleset and system configuration review
• Network sniffing
• File integrity checking
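File integrity checking, for example, compares current file hashes against a previously recorded baseline. A minimal sketch using SHA-256; the paths and file contents are hypothetical:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of the SHA-256 hash of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical baseline recorded at a known-good point in time.
baseline = {"/etc/passwd": sha256(b"root:x:0:0\n")}

def integrity_check(path: str, current_contents: bytes) -> bool:
    """Return True if the file still matches its recorded baseline hash."""
    return baseline.get(path) == sha256(current_contents)

print(integrity_check("/etc/passwd", b"root:x:0:0\n"))        # True (unchanged)
print(integrity_check("/etc/passwd", b"root:x:0:0\nevil\n"))  # False (tampered)
```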
102. 2. Target Identification and Analysis Techniques
Testing techniques, generally performed using automated tools, are used to
identify systems, ports, services, and potential vulnerabilities. These techniques
include:
• Network discovery
• Network port and service identification
• Vulnerability scanning
• Wireless scanning
• Application security examination
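Network port and service identification, at its simplest, attempts TCP connections to a list of ports and labels those that answer. A minimal sketch; the target host, port list, and service table are hypothetical, and such scans should only be run against systems you are authorized to test:

```python
import socket

# Illustrative port-to-service table (not a complete mapping).
COMMON_SERVICES = {22: "ssh", 80: "http", 443: "https"}

def scan(host: str, ports, timeout: float = 0.5):
    """Return {port: service_name} for ports that accept a TCP connection."""
    open_ports = {}
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports[port] = COMMON_SERVICES.get(port, "unknown")
    return open_ports

# Example (against a host you control):
# print(scan("192.0.2.10", [22, 80, 443]))
```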
103. 3. Target Vulnerability Validation Techniques
Testing techniques that corroborate the existence of vulnerabilities; these may
be performed manually or with automated tools. These techniques include:
• Password cracking
• Penetration testing
• Social engineering
• Application security testing
104. Security Testing Frameworks:
There are numerous security testing methodologies being used today by
security auditors for technical control assessment.
Four of the most common are as follows:
1. Open Source Security Testing Methodology Manual (OSSTMM)
2. Information Systems Security Assessment Framework (ISSAF)
3. NIST SP 800-115
4. Open Web Application Security Project (OWASP)
105. Audit Process:
A successful audit will minimally:
1. Establish a prioritized list of risks to an organization.
2. Delineate a plan to alleviate those risks.
3. Validate that the risks have been mitigated.
4. Develop an ongoing process to minimize risk.
5. Establish a cycle of reviews to validate the process on a perpetual basis.
106. Every successful audit has common properties:
Define the security perimeter – what is being examined?
Describe the components – and be detailed about it.
Determine threats – what kinds of damage could be done to the systems
Delineate the available tools – what documents and tools are in use or need to be created?
Reporting mechanism – how will you show progress and achieve validation in all areas?
Review history – is there institutional knowledge about existing threats?
Determine Network Access Control list – who really needs access to this?
Prioritize risk – calculate risk as Risk = probability * harm
Delineate mitigation plan – what are the exact steps required to minimize the threats?
Implement procedures – start making changes.
Review results – perform an after-action review (AAR) on the audit process.
Rinse and repeat – schedule the next iteration of the process.
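The Risk = probability * harm calculation above can be applied directly to produce the prioritized risk list that the audit process calls for. A sketch with hypothetical threats and scores (probabilities in [0, 1], harm on a 1-10 scale):

```python
# Hypothetical threat register for illustration only.
threats = [
    {"name": "phishing",       "probability": 0.6, "harm": 7},
    {"name": "ransomware",     "probability": 0.2, "harm": 10},
    {"name": "insider misuse", "probability": 0.1, "harm": 8},
]

# Risk = probability * harm, per the formula above.
for t in threats:
    t["risk"] = t["probability"] * t["harm"]

# Highest risk first: the prioritized list the audit should establish.
prioritized = sorted(threats, key=lambda t: t["risk"], reverse=True)
print([t["name"] for t in prioritized])  # ['phishing', 'ransomware', 'insider misuse']
```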
107. Types of assessments that can be performed to test security controls:
• Risk assessment
• Policy assessment
• Social engineering
• Security design review
• Security process review
• Interviews
• Observation
• Document review
• Technical review
108. Testing Security Technology
There are many terms used to describe the technical review of security
controls
Ethical hacking, penetration test, and security testing are often used
interchangeably to describe a process that attempts to validate security
configuration and vulnerabilities by exploiting them in a controlled manner to
gain access to computer systems and networks.
109. There are generally two distinct levels of security testing commonly
performed today:
• Vulnerability assessment
• Penetration test
110. Vulnerability assessment
• This technical assessment is intended to identify as many potential
weaknesses in a host, application, or entire network as possible based on
the scope of the engagement.
• These types of assessments are notorious for finding an enormous number
of potential problems, requiring a security expert to prioritize and
validate the real issues that need to be addressed.
111. Penetration test
• The penetration test is intended to assess the prevention, detection, and
correction controls of a network by attempting to exploit vulnerabilities and
gain control of systems and services.
• Penetration testers (also known as pentesters) scan for vulnerabilities as part of
the process just like a vulnerability assessment, but the primary difference
between the two is that a pentester also attempts to exploit those
vulnerabilities as a method of validating that there is an exploitable weakness.
• If someone is able to exploit a device without triggering any alarms, then
detective controls need to be strengthened so that the organization can better
monitor for anomalies.
112. Auditors, on the other hand, might not test to that degree and will more than
likely work with a penetration tester or team if a significant level of detailed
knowledge is required for the audit.
113. When performing these types of engagements, four classes of penetration
tests can be conducted and are differentiated by how much prior knowledge
the penetration tester has about the system.
The four types are:
• Red Team/Blue Team Assessment
• White Box
• Black Box
• Grey Box
114. Red and Blue Team Assessment
• Red and Blue Team assessment is like a war game, where the organization
being tested is put to the test in as real a scenario as possible.
• Red Team assessments are intended to show all of the various methods an
attacker can use to gain entry
• It is the most comprehensive of all security tests. This assessment method
tests policy and procedures, detection, incident handling, physical security,
security awareness, and other areas that can be exploited.
• The Red team is designated as the attacker, and the Blue team as the builder
of the defence mechanisms.
• The two teams sharpen an organisation’s detection and response capability.
115. Black Box Testing
• This assumes no prior knowledge of the infrastructure to be tested. The
testers must first determine the location and extent of the systems before
commencing their analysis.
• Black box testing simulates an attack from someone who is unfamiliar with
the system.
• Black box techniques should be used primarily to assess the security of
individual high-risk compiled components; interactions between
components; and interactions between the entire application or application
system with its users, other systems, and the external environment.
• Black box techniques should also be used to determine how effectively an
application or application system can handle threats.
116. White Box Testing
• This provides the testers with complete knowledge of the infrastructure to
be tested, often including network diagrams, source code, and IP
addressing information.
• White box testing simulates what might happen during an "inside job" or
after a "leak" of sensitive information, where the attacker has access to
source code, network layouts, and possibly even some passwords.
• White box techniques involve direct analysis of the application’s source
code, and black box techniques are performed against the application’s
binary executable without source code knowledge.
117. White Box Testing
• Most assessments of custom applications are performed with white box
techniques, since source code is usually available—however, these
techniques cannot detect security defects in interfaces between
components, nor can they identify security problems caused during
compilation, linking, or installation-time configuration of the application.
• White box techniques still tend to be more efficient and cost-effective for
finding security defects in custom applications than black box techniques.
118. Grey box testing
• These are the several variations between white box and black box testing,
where the testers have partial information.
• Penetration tests can also be described as "full disclosure" (white box),
"partial disclosure" (grey box), or "blind" (black box) tests based on the
amount of information provided to the testing party.
119. Phases of Information Security Audit and Strategies
• Pre-audit agreement stage
• Initiation and Planning stage
• Data collection and fieldwork (Test phase)
• Analysis
• Reporting
• Follow-through
120. Ethics of an Information Security Auditor :
Members and ISACA (Information Systems Audit and Control Association) certification
holders shall:
1. Support the implementation of appropriate standards and procedures for the effective
governance and management of enterprise information systems and technology, including:
audit, control, security and risk management.
2. Perform their duties with objectivity, due diligence and professional care, in accordance
with professional standards.
3. Serve in the interest of stakeholders in a lawful manner, while maintaining high standards
of conduct and character, and not discrediting their profession or the Association.
4. Maintain the privacy and confidentiality of information obtained in the course of their
activities unless disclosure is required by legal authority. Such information shall not be used
for personal benefit or released to inappropriate parties.
121. Ethics of an Information Security Auditor :
5. Maintain competency in their respective fields and agree to undertake only
those activities they can reasonably be expected to complete with the necessary
skills, knowledge and competence.
6. Inform appropriate parties of the results of work performed including the
disclosure of all significant facts known to them that, if not disclosed, may
distort the reporting of the results.
7. Support the professional education of stakeholders in enhancing their
understanding of the governance and management of enterprise information
systems and technology, including: audit, control, security and risk
management.
122. What Makes an Information Security Auditor?
• At minimum, a bachelor's degree
• Certification is often highly recommended and may be required by some
employers prior to hiring.
• A Certified Information Systems Auditor (CISA) is an independent expert
who is qualified to perform information systems audits. This has uplifted the
status of the CISA designation, which is often a mandatory qualification for an
information systems auditor.