In this webinar session, we share a comprehensive list of peer code review best practices, distilled from years of SmartBear research and case studies. At the end, we show how our code and document review tool, Collaborator, can help teams put these tactics into practice.
10. Code Review is the #1 Way to Improve Code Quality
According to The 2018, 2017, and 2016 State of Code Review reports.
15. Types of Code Review: Formal Inspections
Planning → Introductory Meeting → Inspection Meeting → Rework → Verification Meeting → Complete (plus a Follow-Up Meeting).
During inspection, readers and reviewers inspect the code privately. At verification, if no defects are found, the review is complete; if more defects are found, the code goes back to rework.
21–28. What Works Best:
• Conduct Reviews Daily (The 2018 State of Code Review)
• Set Clear Expectations: utilize a checklist of things to check for at the class, method, and class-hierarchy levels (research by Alastair Dunsmoore, Marc Roper, and Murray Wood, 2000; The Best Kept Secrets of Peer Code Review). A sketch of such a checklist follows below.
• Go Tool-Assisted + Meetings: among respondents who never perform meeting-based reviews, only 28% are satisfied with their code quality and 48% are unsatisfied (The 2018 State of Code Review). Research by Lawrence Votta of AT&T Bell Laboratories (1993), with supporting research by Diane Kelly and Terry Shepard (2003) and Reidar Conradi (2003); The Best Kept Secrets of Peer Code Review.
• Report on Their Process (The 2018 State of Code Review)
• Foster a Learning Culture
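To make the checklist guidance concrete, here is a minimal sketch of a review checklist organized by the class, method, and class-hierarchy levels mentioned above. The individual items and the dictionary layout are illustrative assumptions, not taken from the cited research or reports.

```python
# Illustrative review checklist grouped by the levels studied by
# Dunsmoore, Roper, and Wood: class, method, and class hierarchy.
# The specific items below are example assumptions, not a published list.
REVIEW_CHECKLIST = {
    "class": [
        "Does the class have a single, clearly stated responsibility?",
        "Are invariants documented and enforced in the constructor?",
    ],
    "method": [
        "Are parameters validated and error cases handled?",
        "Is the method short enough to read in one pass?",
    ],
    "class-hierarchy": [
        "Does the subclass honor the base class contract?",
        "Is inheritance used for 'is-a' relationships rather than code reuse?",
    ],
}


def print_checklist(levels=("class", "method", "class-hierarchy")):
    """Print the checklist items a reviewer should walk through."""
    for level in levels:
        print(f"== {level} ==")
        for item in REVIEW_CHECKLIST[level]:
            print(f"[ ] {item}")


if __name__ == "__main__":
    print_checklist()
```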
30–34. Review Guidelines
• Limit reviews to 60 minute sessions at most (research by Alastair Dunsmoore, Marc Roper, and Murray Wood, 2000; The Best Kept Secrets of Peer Code Review)
• Consider shorter function length to improve readability (research by Uwano at Nara Institute of Science and Technology, 2006; The Best Kept Secrets of Peer Code Review); a simple length-check sketch follows below
• Slow down your reading and take time to scan (research by Uwano at Nara Institute of Science and Technology, 2006)
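As a rough illustration of the function-length guideline, the sketch below flags overly long Python functions so readability risks can be spotted before a review starts. It assumes Python 3.8+ (for end_lineno) and uses an arbitrary 40-line threshold; the cited research does not prescribe a specific number.

```python
import ast
import sys

# Illustrative threshold; adjust to your team's own guideline.
MAX_FUNCTION_LINES = 40


def long_functions(path, limit=MAX_FUNCTION_LINES):
    """Yield (name, line, length) for functions longer than `limit` lines."""
    with open(path, encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            length = node.end_lineno - node.lineno + 1
            if length > limit:
                yield node.name, node.lineno, length


if __name__ == "__main__":
    # Usage: python check_length.py file1.py file2.py ...
    for filename in sys.argv[1:]:
        for name, line, length in long_functions(filename):
            print(f"{filename}:{line}: {name} is {length} lines "
                  f"(limit {MAX_FUNCTION_LINES})")
```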
35–37. Ideal Review (Cisco Case Study with SmartBear, 2006; The Best Kept Secrets of Peer Code Review)
• Author annotates their code prior to reviewers reading
• 100 – 300 lines of code per review; a pre-review size check sketch follows below
• Reviewers spend 30 – 60 minutes reviewing the code change
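To help keep changes inside the 100 – 300 line guideline, a team could run a quick size check before sending a review. The sketch below is one possible approach; it assumes a git repository with a base branch named origin/main and counts added plus deleted lines reported by git diff --numstat.

```python
import subprocess

# Size guidance from the Cisco/SmartBear case study: 100-300 lines per review.
MAX_REVIEW_LINES = 300


def changed_lines(base="origin/main"):
    """Count added + deleted lines against `base` using `git diff --numstat`."""
    out = subprocess.run(
        ["git", "diff", "--numstat", base],
        capture_output=True, text=True, check=True,
    ).stdout
    total = 0
    for line in out.splitlines():
        added, deleted, _path = line.split("\t", 2)
        # Binary files report "-" instead of a line count; skip those fields.
        if added != "-":
            total += int(added)
        if deleted != "-":
            total += int(deleted)
    return total


if __name__ == "__main__":
    lines = changed_lines()
    if lines > MAX_REVIEW_LINES:
        print(f"Change is {lines} lines; consider splitting it "
              f"(guideline: at most {MAX_REVIEW_LINES}).")
    else:
        print(f"Change is {lines} lines; within the 100-300 line guideline.")
```

A team might wire a check like this into a pre-push hook or CI step so oversized changes are flagged automatically before reviewers are added.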
38. Goals & Reading Metrics
• Internal Metrics
  • Inspection Rate
  • Defect Rate
  • Defect Density
  • Defect Count
  • Defect Type and Severity
• External Metrics
  • Number of support calls
  • Number of bugs injected by Development
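To make the internal metrics concrete, the sketch below computes them using their commonly cited definitions: inspection rate as lines of code reviewed per hour, defect rate as defects found per hour, and defect density as defects per 1,000 lines of code (kLOC). The data structure, field names, and example numbers are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class ReviewRecord:
    """One completed review; field names are illustrative assumptions."""
    lines_of_code: int    # size of the change under review
    review_hours: float   # total reviewer time spent
    defect_count: int     # defects found during the review


def inspection_rate(r: ReviewRecord) -> float:
    """Lines of code inspected per hour of review time."""
    return r.lines_of_code / r.review_hours


def defect_rate(r: ReviewRecord) -> float:
    """Defects found per hour of review time."""
    return r.defect_count / r.review_hours


def defect_density(r: ReviewRecord) -> float:
    """Defects found per 1,000 lines of code (kLOC)."""
    return r.defect_count / (r.lines_of_code / 1000.0)


if __name__ == "__main__":
    record = ReviewRecord(lines_of_code=250, review_hours=0.75, defect_count=3)
    print(f"Inspection rate: {inspection_rate(record):.0f} LOC/hour")
    print(f"Defect rate: {defect_rate(record):.1f} defects/hour")
    print(f"Defect density: {defect_density(record):.1f} defects/kLOC")
```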
40. Reviewing Artifacts
• 90% of teams are conducting document review of some kind.
• 35% of respondents are using a tool to do this.
• Teams using a tool for reviewing documents are 26% more likely to be satisfied with their code review process.
The 2018 State of Code Review
41. Review Best Practices
• Conduct reviews daily
• Set clear expectations with checklists
• Conduct tool-assisted reviews + meetings
• Create goals and report on your process
• Foster a learning culture
• Limit reviews to 60 minute sessions
• Shorten function length for readability
• Slow down your reading and take time to scan
• Author annotates code prior to adding reviewers
• 100 – 300 lines of code at a time
• Reviewers should spend 30-60 minutes at a time
• Verify that defects have been addressed
• Conduct peer reviews on other docs & artifacts
• Maintain review history for long-term success
48. Smart Review Practices
• Set clear expectations with checklists
• Create goals and report on your process
• Author annotates code prior to adding reviewers
• Verify that defects have been addressed
• Conduct peer reviews on other docs & artifacts
• Maintain review history for long-term success
Teams that report on their code reviews are 5X as likely to be satisfied with their process. Teams that review documents in a tool are 26% more likely to be satisfied with their code reviews.
SmartBear tools are synonymous with quality across the entire SDLC, from planning and design to testing and monitoring. We were founded in 2009 in Boston, Massachusetts, and have grown to over 11 global offices across North America, Europe, Asia, and Australia.
Over 6 million developers, testers, and operations professionals from more than 20,000 companies, like Google, RBC, and Vineyard Vines, use our tools every day to deliver bug-free software.
Our company of over 500 employees also provides 5 free tools for the community, including two wildly popular open source tools, Swagger and SoapUI.
Our portfolio spans 6 different categories.
Plan – with HipTest and SwaggerHub
Create – with ServiceV Pro, Collaborator, and TestLeft
Test – with SoapUI Pro, LoadUI Pro, TestComplete, LoadNinja, and CBT
Deploy – we partner here with popular solutions like AWS and Docker
Monitor – with AlertSite
and finally, Manage and Analyze – with Zephyr.