14. Amount of Comments
• A crutch for lazy coding
• Misuse of existing systems
• Creates a maintenance burden
• Guaranteed to drift out of sync with the code and become incorrect
15. Code Coverage
• Executing code via automated tests increases the quality of the code
• The percentage of lines of code executed by automated tests
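As a sketch, the metric itself is trivial arithmetic (the function name and numbers here are mine, not from the deck):

```python
def coverage_percentage(executed_lines: int, total_lines: int) -> float:
    """Percentage of lines of code executed by the automated test suite."""
    if total_lines == 0:
        return 0.0
    return 100.0 * executed_lines / total_lines

print(coverage_percentage(850, 1000))  # 85.0
```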
16. Code Coverage
1. Codebase has 85% coverage
2. Developer refactors some code and commits the changes
3. Codebase drops to 82% coverage; the build fails because the threshold is 85%
What went wrong?
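One likely answer, sketched with made-up numbers: refactoring that deletes well-covered lines removes the same amount from the numerator and the denominator, which drags the percentage down even though no tests were removed.

```python
def coverage(executed_lines: int, total_lines: int) -> float:
    """Line coverage as a percentage."""
    return 100.0 * executed_lines / total_lines

# Before the refactor: 850 of 1000 lines executed.
before = coverage(850, 1000)              # 85.0
# The refactor deletes 100 duplicated lines, all of which were covered.
after = coverage(850 - 100, 1000 - 100)   # ~83.3 -- build fails
```

Nothing about the tests got worse; the ratio simply punished removing well-tested code.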
17. Code Coverage
• Codebase has 100% coverage
• Developer makes daily releases to fix daily defect reports
What’s the problem?
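The problem: line coverage only says the code ran, not that its behaviour was checked. A hypothetical illustration — this test executes every line (100% coverage) and asserts nothing, so the defect ships anyway:

```python
def apply_discount(price: float, percent: float) -> float:
    # Bug: adds the discount instead of subtracting it.
    return price + price * percent / 100

def test_apply_discount():
    # Executes every line of apply_discount -> 100% line coverage,
    # but makes no assertion, so the bug goes undetected.
    apply_discount(100.0, 10.0)
```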
20. Code Coverage
• Context is king
• Who decides the arbitrary “acceptable” percentage?
• Who decides what code needs to be tested?
21. Maintainability Index
MI = MAX(0, (171 − 5.2 × ln(Halstead Volume) − 0.23 × Cyclomatic Complexity − 16.2 × ln(Lines of Code)) × 100 / 171)
• Higher number means more maintainable (scaled 0–100)
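A direct transcription of the formula into Python (natural logarithms, as in the commonly cited Visual Studio variant; the input values below are invented for illustration):

```python
import math

def maintainability_index(halstead_volume: float,
                          cyclomatic_complexity: float,
                          lines_of_code: int) -> float:
    """Maintainability Index, rescaled to 0-100 (Visual Studio variant)."""
    raw = (171
           - 5.2 * math.log(halstead_volume)
           - 0.23 * cyclomatic_complexity
           - 16.2 * math.log(lines_of_code))
    return max(0.0, raw * 100 / 171)

# Hypothetical module: volume 1000, complexity 10, 200 lines.
print(maintainability_index(1000.0, 10.0, 200))  # ~27.5
```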
22. Maintainability Index
• What does it tell us though?
• Comparison of two codebases?
– Only if they have very similar LoC and CC values
• Comparison of two modules?
• How hard it is to understand the code?
23. Maintainability Index
• With no peer comparison it is an arbitrary number
• What is an application’s peer though?
24. Generated Code
• Should metrics be applied?
• What does the metric mean?
• How easily can you exclude/separate?
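How easily you can exclude generated code depends on the tool. As one illustration, coverage.py for Python supports omitting files by glob pattern in its config file (the patterns below are hypothetical examples for generated sources):

```ini
# .coveragerc -- keep generated code out of the coverage numbers
[run]
omit =
    */generated/*
    *_pb2.py
```

If the tool offers no such hook, the metric silently blends hand-written and machine-written code.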
25. The Lie
• Metrics offer empirical evidence about a codebase
• By themselves, metrics provide meaning
26. The Truth
• Metrics are nothing more than statistics
• Can be made to represent anything
• The numbers, by themselves, represent nothing
• Interpretation is in the eye of the beholder
27. The Light Side
• The output numbers mean nothing by themselves
• Look at interesting comparisons
– Time based
– Module based
– Developer based
• Look for deltas rather than actuals
28. Deltas Over Time
• How did <metric> change
– since last run?
– since last release?
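A minimal sketch of the delta idea, with invented coverage numbers keyed by run date — the sign and size of the change is the signal, not the absolute value:

```python
def metric_delta(previous: float, current: float) -> float:
    """Change in a metric between two runs; the delta is what you inspect."""
    return current - previous

# Hypothetical coverage readings from two consecutive runs.
history = {"2024-01": 85.0, "2024-02": 82.0}
delta = metric_delta(history["2024-01"], history["2024-02"])
print(delta)  # -3.0 -> coverage fell since the last run; ask why
```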
29. Related Metrics
• Code Coverage vs Defect Reports
• Defect Reports vs Code Churn
• Look at the deltas over a time period
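Pairing two metrics over the same periods can be sketched like this (the release names, coverage figures, and defect counts are all invented; the point is comparing deltas side by side, not the values themselves):

```python
# Hypothetical per-release data: (release, coverage %, open defect reports).
releases = [
    ("v1.0", 70.0, 40),
    ("v1.1", 80.0, 35),
    ("v1.2", 90.0, 34),
]

def paired_deltas(rows):
    """For each release, pair the coverage delta with the defect delta."""
    out = []
    for (_, cov_a, def_a), (name, cov_b, def_b) in zip(rows, rows[1:]):
        out.append((name, cov_b - cov_a, def_b - def_a))
    return out

# Same +10 coverage each release, but shrinking defect payoff --
# a question worth asking that neither metric raises alone.
print(paired_deltas(releases))
```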