This document summarizes Marco Morana's presentation on secure code reviews. The presentation covers what secure code reviews are and are not, why they are needed, methodologies for conducting them, common coding mistakes, and resources for further information. Secure code reviews help ensure compliance with governing policies, adherence to security best practices, and the adequacy of security controls. They should be integrated into the software development lifecycle and driven by threat modeling. The methodology includes prioritizing code based on threats, categorizing vulnerabilities, and providing recommendations. Common mistakes include insecure configuration, inadequate data protection, and authentication and authorization issues.
1. Secure Code Reviews
Marco Morana
Senior Consultant
Foundstone, A Division of McAfee
Marco Morana
Secure Code Reviews
33rd CSI Conference, DEV-7 November 7th, 2006 1
Orlando, Florida
2. Agenda
• Introduction
– What Secure Code Reviews Are Not
– Why We Need Secure Code Reviews?
– Code Reviews
• Concepts and Strategies
– Secure Code Reviews in the SDLC
– Threat Modeling
– Methodology
– Coding Mistakes
– Tools
• Tips And Tricks
• Resources
3. Disclaimers
Secure code reviews are not:
1. A stand-alone activity separate from the SDLC
2. A process that relies solely on tools:
– Managed programming languages
– Automated code analysis
3. A method to certify code as un-attackable
– Code not scrutinized by security experts
– A clean tool run gives a false sense of security (i.e., false negatives)
4. Why we need secure code reviews?
1. Compliance with governing policies
2. Assurance that code follows security best practices
3. Security assessment before release to QA and production
4. Measurement of the adequacy of security controls to mitigate known threats
5. Code Reviews
• One to One (peer to peer)
– Part of the sign-off before handing off to QA
– Integrated with the check-in process
• Group (team-driven)
– Advantage of many eyeballs
– Team members take different roles
Both need preparation and organization
6. Code Reviews - Team Code Review Approach
• Optimal scenario: A team of 4 people in a
conference room with a whiteboard and
projector
• Team Roles
– Lead Reviewer
– Narrator
– Author
– Subject Matter Experts
8. Secure Code Reviews in the SDLC
9. Code reviews in the Software Security Life Cycle
The economics of security defects
11. Methodology – Secure Code Review Process
1. Build a Threat Model
– Identify, evaluate and mitigate risks for the particular
application
2. Build an Attack Plan
– Prioritize threats based on criticality
– Map threats to code artifacts
– Determine which high-risk areas to focus efforts on, given available man-hours and costs
3. Code Review
– Review each section of the code for vulnerability categories
– Document each vulnerability as a bug or a flaw
12. What Is Threat Modeling?
• Goal: Identify the threats against the system and
the appropriate countermeasures to mitigate the
risk they pose
• Model the system as an attacker sees it:
– Where are the entry points?
– Which assets are targets?
• Recognize the attacker’s advantage and the defender’s dilemma:
– Developers need to get the code 100% correct, 100% of the time, with limited resources and development time
– Attackers need to find just one hole and can spend as much time finding it as they want
13. Methodology - Secure Code Reviews Best Practices
• Have clear goals
– Tactical and strategic scenarios (e.g. new release vs. production)
– Be specific on what must be accomplished
• Decide which analysis style works best
– Depth first vs. breadth first approach
• Prioritize and simplify
– Prioritize based upon critical areas
– Break system complexity
• Be methodical
– Annotate the code you are reviewing (e.g. comments, IDE task
lists)
– Use checklists
14. Methodology - Secure Code Reviews
• Reduce complexity
– Threat modeling
– Rapid scan
• Review critical sections of the code
– Correlate and annotate
– Use IDE tools (e.g. Visual Studio, Eclipse)
• Categorize security defects
– Threat categorization
– Check lists
– Bugs vs. flaws
15. Methodology - Security Defects Categorization
Can be categorized as:
• Security Bugs
– An implementation level software security
problem (e.g. buffer overflows, SQL injection)
• Security Flaws
– A design level software security problem (e.g.
an insecure authorization model or data access
layer)
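The SQL injection bug cited above can be sketched in Java (the language of the deck's JDBC examples). The table and column names below are hypothetical, and the fix shown is the standard parameterized-query remediation rather than code from the presentation.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class AccountQueries {

    // SECURITY BUG (implementation level): untrusted input is concatenated
    // straight into the SQL string, so input like  x' OR '1'='1  changes
    // the meaning of the query.
    static String vulnerableSql(String owner) {
        return "SELECT balance FROM accounts WHERE owner = '" + owner + "'";
    }

    // FIX: a parameterized query binds the value outside the SQL grammar,
    // so the driver treats it as data, never as query text.
    static PreparedStatement safeQuery(Connection conn, String owner)
            throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT balance FROM accounts WHERE owner = ?");
        ps.setString(1, owner);
        return ps;
    }

    public static void main(String[] args) {
        // The injected condition turns the WHERE clause into a tautology:
        System.out.println(vulnerableSql("x' OR '1'='1"));
    }
}
```

Note that the bug is easy for a lexical tool to miss and easy for a reviewer tracing data flow from an entry point to catch, which is the kind of distinction the methodology draws.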
16. Methodology - Threat Categorization
Code is insecure because of the following threats:
• STRIDE: Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege
Code is made secure by mapping threats to security controls:
• CIA: Confidentiality, Integrity, Availability
17. Methodology - Security Frame Categorization
• Configuration Management
– Issues stemming from insecure deployment and administration
• Data Protection in Storage and Transit
– Lack of adequate protection for secrets and other sensitive data
• Authentication
– Lack of strong protocols to verify the identity of a component
outside the trust boundary
• Authorization
– Lack of mechanisms to enforce access controls on protected
resources within the system
18. Methodology – Security Frame Categorization
• User and Session Management
– Lack of mechanisms to maintain session independence between multiple logged-on users, and insecure user provisioning and de-provisioning policies
• Data Validation
– Lack of input and output validation when data crosses system or
trust boundaries
• Error handling and Exception Management
– Failure to deal with exceptions effectively and in a secure manner, resulting in unauthorized disclosure of information
• Logging and Auditing
– Failure to maintain detailed and accurate application logs that can
allow for traceability and non-repudiation
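The Data Validation category lends itself to a short illustration: validate input against a whitelist at the trust boundary instead of blacklisting known-bad characters. This is a minimal sketch; the username field and its rule are assumptions, not content from the slides.

```java
import java.util.regex.Pattern;

public class InputValidator {

    // Whitelist: accept only 3-16 alphanumeric characters for a username.
    // (The field and its rule are hypothetical examples.)
    private static final Pattern USERNAME =
            Pattern.compile("^[A-Za-z0-9]{3,16}$");

    static boolean isValidUsername(String input) {
        return input != null && USERNAME.matcher(input).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidUsername("alice42"));      // conforms to the whitelist
        System.out.println(isValidUsername("x' OR '1'='1")); // rejected at the boundary
    }
}
```

The whitelist approach rejects whole classes of malformed input, including patterns nobody thought to blacklist.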
19. Methodology - Secure Code Review Findings
• Sections:
– Bugs vs. Flaws
– Threat Categorization
– Risk Rating
– Module and LOC range
– Code Snippet
– Commendation or Recommendation
• Recommendations are often not limited to the code but extend to the design and the deployment environment as well
21. Coding Mistakes - Configuration Management
Hard-coded credentials in an application configuration file:
# credentials for the application database
datasource.name=jdbc_1
datasource.url=jdbc:oracle:thin:@dhs:1521:ORA1
datasource.classname=oracle.jdbc.driver.OracleDriver
datasource.username=scott
datasource.password=tiger
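One common remediation, offered here as an illustrative sketch rather than the presentation's prescribed fix, is to keep the non-secret connection details in the file and resolve the password at runtime from the environment. The DB_PASSWORD variable name and the withPassword helper are assumptions.

```java
import java.util.Properties;
import java.util.function.Function;

public class DataSourceConfig {

    // Copies the base properties and injects the password from a lookup
    // (System::getenv in production); the lookup is abstracted so the
    // wiring can be exercised without touching real environment state.
    static Properties withPassword(Properties base, Function<String, String> lookup) {
        String password = lookup.apply("DB_PASSWORD"); // assumed variable name
        if (password == null || password.isEmpty()) {
            throw new IllegalStateException("DB_PASSWORD not set");
        }
        Properties props = new Properties();
        props.putAll(base);
        props.setProperty("datasource.password", password);
        return props;
    }

    public static void main(String[] args) {
        Properties base = new Properties();
        base.setProperty("datasource.url", "jdbc:oracle:thin:@dhs:1521:ORA1");
        base.setProperty("datasource.username", "scott");
        // In production: withPassword(base, System::getenv)
        Properties configured = withPassword(base, key -> "example-secret");
        System.out.println(configured.getProperty("datasource.password"));
    }
}
```

The password now lives outside source control, and a missing secret fails loudly at startup instead of silently shipping a default.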
31. Coding Mistakes - Error Handling And Exception Handling
Error Message:
executeRSProcedure Exception: java.sql.SQLException: ORA-06502: PL/SQL: numeric or value error: character to number conversion error
Server Name: host1.acme.com
Server Info: IBM WebSphere Application Server/5.1
Remote Address: 192.168.12.34
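The message above discloses the database error, the application-server product and version, the host name, and an internal address. A sketch of the usual countermeasure (the logger name, incident-ID scheme, and wording are assumptions): record full diagnostics server-side and hand the client only an opaque reference.

```java
import java.util.UUID;
import java.util.logging.Logger;

public class SafeErrorHandler {

    private static final Logger LOG = Logger.getLogger("app.errors");

    // Full exception detail stays in the server-side log, keyed by an
    // incident ID; the client sees only a generic message plus the ID,
    // which support staff can correlate with the log entry.
    static String handle(Exception e) {
        String incidentId = UUID.randomUUID().toString();
        LOG.severe("incident " + incidentId + ": " + e);
        return "An internal error occurred (ref " + incidentId + ").";
    }

    public static void main(String[] args) {
        System.out.println(handle(
                new RuntimeException("ORA-06502: PL/SQL: numeric or value error")));
    }
}
```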
32. Coding Mistakes - Error Handling And Exception Handling
• “The password is invalid for the account”
• “The username does not exist”
• “The DOB you entered is invalid”
• “Your account has been locked due to too many invalid attempts”
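Each of those messages confirms or denies something an attacker can use to enumerate valid accounts. A common countermeasure, sketched below with hypothetical check results standing in for real authentication logic, is to collapse every failure into a single generic response.

```java
public class LoginMessages {

    // One message for every authentication failure; the wording is an
    // illustrative choice, not taken from the presentation.
    static final String GENERIC_FAILURE = "The username or password is incorrect.";

    // userExists and passwordOk stand in for the real (hypothetical here)
    // checks. Returns null on success; never reveals which check failed.
    static String failureMessage(boolean userExists, boolean passwordOk) {
        return (userExists && passwordOk) ? null : GENERIC_FAILURE;
    }

    public static void main(String[] args) {
        // An unknown user and a wrong password produce identical output:
        System.out.println(failureMessage(false, false));
        System.out.println(failureMessage(true, false));
    }
}
```

During a review this is a useful pattern to check for: any branch where the user-visible text depends on which credential check failed is a finding.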
35. Tools - Tools for Static Code Analysis
Advantages:
• Perform preliminary scanning of large code
sets in little time
• Provide consistent results
• Can be used as a secure-code check-in gateway
• Identify common coding bugs (low-hanging fruit)
36. Tools - Tools for Static Code Analysis
Common bugs identified by static parsers:
• Insecure functions
• Lack of proper input validation and output filtering
• Weak crypto algorithms
• Exception handling errors
• Hard-coded passwords, keys, and connection strings
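As a toy illustration of how such lexical parsers flag low-hanging fruit (real tools such as RATS or FlawFinder do far more), a pattern matcher over source lines might look like the sketch below; the two rules are deliberately minimal assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

public class ToyScanner {

    // Two grep-style rules: a few classically unsafe C functions, and a
    // string literal assigned to something named "password".
    private static final Map<String, Pattern> RULES = Map.of(
        "insecure function", Pattern.compile("\\b(strcpy|gets|sprintf)\\s*\\("),
        "hard-coded password", Pattern.compile("password\\s*=\\s*\"[^\"]+\"")
    );

    static List<String> scan(String sourceLine) {
        List<String> findings = new ArrayList<>();
        for (Map.Entry<String, Pattern> rule : RULES.entrySet()) {
            if (rule.getValue().matcher(sourceLine).find()) {
                findings.add(rule.getKey());
            }
        }
        return findings;
    }

    public static void main(String[] args) {
        System.out.println(scan("strcpy(dst, src);"));
        System.out.println(scan("String password = \"tiger\";"));
        System.out.println(scan("int x = 1;"));
    }
}
```

This kind of matching explains both the speed and the false positives such tools exhibit: a comment that merely mentions strcpy would be flagged just the same.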
37. Tools - Tools for Static Code Analysis
Disadvantages:
• Do not identify design-level security flaws
• Generate a large number of false positives
• Provide a false sense of security
Examples:
• ITS4
• RATS
• FlawFinder
• CodeAssure
• PREfix/PREfast
• Foundstone CodeScout
38. Tools - Tools for Dynamic Analysis
Advantages:
• Integrate with Debuggers and IDEs
• Monitor Access to Resources (Files, Libraries, Data,
Registry Keys)
• Monitor Network Access
• Help Identify Data Flows
Examples:
• CLR Profiler
• NProf
• Sysinternals Tools – FileMon, RegMon
• Foundstone .NETMon
39. Tips And Tricks
1. Have a plan
– Focus on clear objectives
– Organize the team
– Review incrementally
2. Follow a methodology
– Identify threats and countermeasures
– Use vulnerability check lists and tools
– Categorize security defects
40. Tips And Tricks
3. Integrate With Other Activities in the S-SDLC
– Information risk management
– Metrics and measurements
– Training and awareness
4. Revise the Plan and the Process
– Threats and vulnerabilities
– New techniques
– People, process and technology
41. Questions?
42. Resources
• Software Security Code Review: Code Inspection
Finds Problems, R. Araujo and M. Curphey
– http://www.softwaremag.com
• A Process for Performing Security Code Reviews,
M. Howard
– http://www.computer.org
• How To: Perform a Security Code Review for
Managed Code, Microsoft Patterns & Practices
– http://msdn.microsoft.com
43. Contact Information
• Presenter Email:
– marco.morana@foundstone.com
• Foundstone Software Application Security
Services (SASS)
– www.foundstone.com/sass
• Foundstone Training
– www.foundstone.com/education