# Understanding software metrics

This presentation provides a brief overview about object-oriented metrics such as LOC, NOC, LCOM, CBO, CC, and WMC. A few practical issues are discussed in the presentation such as metric threshold and tools. It also discusses "Abstractness and Instability" diagram.

Published in: Data & Analytics

### Understanding software metrics

1. Understanding Software Metrics (Tushar Sharma)
2. Agenda: metrics, what and why; OO metrics and their classification; using metrics in practice; advanced stuff…
3. Software Metrics - What & Why? A software metric is a measure of some property of a software system. Why measure? “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it.” (H. James Harrington)
4. Object-oriented metrics fall into three groups. Size, by scope: Project (LOC, NOC, NON); Namespace (LOC, NOC); Type (LOC, NOM, NOF, DIT, NC, RFC); Method (LOC, NP). Complexity: WMC, CC. Cohesion & Coupling: LCOM, CBO, Fan-in/Fan-out.
5. OO Metrics - Size. • Large entities make comprehension difficult. • Large entities indicate the presence of various smells. Metrics: LOC (Lines Of Code), NON (Number Of Namespaces), NOC (Number Of Classes), NOM (Number Of Methods), NOF (Number Of Fields).
6. OO Metrics - Size: RFC (Response For Class). The response set of a class is the set of methods that can potentially be executed in response to a message received by an object of that class. • The larger the number of methods that can be invoked, the greater the complexity of the class. • If a large number of methods can be invoked in response to a message, testing and debugging the class becomes more complicated.
7. OO Metrics - Size (Hierarchy): DIT (Depth of Inheritance Tree), NC (Number of Children). • Deep inheritance trees make comprehension difficult: the deeper a class sits in the hierarchy, the more methods it is likely to inherit, and the harder it becomes to predict its behaviour. • Wide inheritance trees also hurt comprehension: a wide hierarchy indicates a missing intermediate inheritance level, which in turn may lead to code duplication.
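As an illustration of the two hierarchy metrics, here is a minimal Python sketch that computes DIT and NC for live classes via reflection. The classes `A` through `D` are hypothetical examples; real metric tools derive the hierarchy from source code rather than from loaded objects.

```python
def dit(cls):
    """Depth of Inheritance Tree: longest path from cls down from object."""
    if cls is object:
        return 0
    return 1 + max(dit(base) for base in cls.__bases__)

def noc(cls):
    """Number of Children: direct subclasses only."""
    return len(cls.__subclasses__())

# Hypothetical hierarchy: A has two children, C is three levels deep.
class A: pass
class B(A): pass
class C(B): pass
class D(A): pass

print(dit(C))  # A -> B -> C below object: depth 3
print(noc(A))  # B and D inherit directly from A: 2
```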
8. OO Metrics - Cohesion and Coupling: LCOM (Lack of Cohesion Of Methods). Consider a class C with n methods M1, M2, …, Mn. Let Ii be the set of instance variables used by method Mi; there are n such sets I1, …, In. Define P = {(Ii, Ij) | Ii ∩ Ij = ∅} and Q = {(Ii, Ij) | Ii ∩ Ij ≠ ∅}. Then LCOM = |P| − |Q| if |P| > |Q|, and 0 otherwise. • A high LCOM value indicates a violation of the Single Responsibility Principle.
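The LCOM definition above translates almost directly into Python. The `fields` mapping below is a hypothetical example class with two disjoint method clusters; a real tool would extract each method's field-usage set from source code.

```python
from itertools import combinations

def lcom(method_fields):
    """Chidamber-Kemerer LCOM: |P| - |Q| if positive, else 0.

    method_fields maps each method name to the set of instance
    variables that method uses.
    """
    p = q = 0  # p: disjoint pairs, q: pairs sharing a field
    for a, b in combinations(method_fields.values(), 2):
        if a & b:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

# Hypothetical class: read/write touch one field cluster, log/flush another.
fields = {
    "read":  {"buffer", "pos"},
    "write": {"buffer"},
    "log":   {"logger"},
    "flush": {"logger", "level"},
}
print(lcom(fields))  # P = 4 disjoint pairs, Q = 2 sharing pairs -> LCOM = 2
```

The nonzero LCOM here flags exactly the kind of split responsibility (I/O vs. logging) the slide warns about.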
9. OO Metrics - Cohesion and Coupling: CBO (Coupling Between Object classes). CBO for a class is a count of the number of other classes to which it is coupled. • Excessive coupling between object classes is detrimental to modular design and prevents reuse; the more independent a class is, the easier it is to reuse in another application. • A measure of coupling is useful for determining how complex the testing of various parts of a design is likely to be: the higher the inter-object class coupling, the more rigorous the testing needs to be.
10. OO Metrics - Complexity: CC (Cyclomatic Complexity), WMC (Weighted Methods per Class). • CC is a quantitative measure of the number of linearly independent paths through a program's source code. • WMC is the sum of the CC of all methods in the class. • The number of methods and their complexity predict how much time and effort is required to develop and maintain the class.
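A rough sketch of both metrics: CC can be approximated as one plus the number of decision points, here counted as branch nodes in a Python AST. Real tools count more constructs (e.g. `case` arms, ternaries in other languages), so treat the node list below as an illustrative assumption, not a canonical definition.

```python
import ast
import textwrap

# Assumed set of "decision point" node types; tools differ on this list.
BRANCH = (ast.If, ast.For, ast.While, ast.ExceptHandler,
          ast.IfExp, ast.And, ast.Or)

def cyclomatic_complexity(func_src):
    """Approximate CC = number of decision points + 1."""
    tree = ast.parse(textwrap.dedent(func_src))
    return 1 + sum(isinstance(n, BRANCH) for n in ast.walk(tree))

def wmc(method_sources):
    """WMC: sum of the CC of every method in the class."""
    return sum(cyclomatic_complexity(src) for src in method_sources)

src = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
print(cyclomatic_complexity(src))  # two if-branches plus one -> CC = 3
```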
11. Using metrics in practice - tools. C/C++: CppDepend, Infusion, Source Monitor. C#: Designite, NDepend. Java: Infusion, Source Monitor.
12. Using metrics in practice - metric thresholds. • There is no single widely accepted standard for metric thresholds. • There are, however, some guidelines and rules of thumb. Recommended thresholds: LOC (method): 50; LOC (file): 500; NOC: 9; DIT: 5; LCOM: 0.80; NOM: 30; CBO: 6; CC: 10; WMC: 100; duplication: 4%.
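Applying such rule-of-thumb thresholds is mechanical; a trivial sketch, using the values from the slide (the metric keys and the measured numbers in the example are hypothetical):

```python
# Rule-of-thumb thresholds from the slide; not a standard.
THRESHOLDS = {
    "LOC_method": 50, "LOC_file": 500, "NOC": 9, "DIT": 5,
    "LCOM": 0.80, "NOM": 30, "CBO": 6, "CC": 10, "WMC": 100,
}

def violations(measured):
    """Return the metrics whose measured value exceeds its threshold."""
    return {m: v for m, v in measured.items()
            if m in THRESHOLDS and v > THRESHOLDS[m]}

print(violations({"CC": 14, "DIT": 3, "CBO": 9}))  # {'CC': 14, 'CBO': 9}
```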
13. Using metrics in practice. What if you want to perform an analysis with customized metric thresholds? http://www.designite-tools.com
14. What can be done with metrics data? • Indicator of “structural quality”. • Identifying refactoring candidates. [Chart: lack-of-cohesion value per class, ranging up to roughly 3500, with threshold lines at 120 and 180.] • Trend analysis. • Evidence of improved quality.
15. What can be done with metrics data? • Detect code/design smells: ✦ Insufficient Modularization: WMC or NOM ✦ Hub-like Modularization: Fan-in/Fan-out ✦ Deep Hierarchy: DIT ✦ Multifaceted Abstraction: LCOM ✦ …
16. Some more metrics ❖ MOOD (Metrics for Object-Oriented Design): ✦ Method Hiding Factor (MHF): the ratio of the sum of the invisibilities of all methods defined in all classes to the total number of methods. ✦ Attribute Hiding Factor (AHF). ✦ Method Inheritance Factor (MIF): the ratio of the sum of inherited methods in all classes to the total number of available methods (locally defined plus inherited) for all classes. ✦ Attribute Inheritance Factor (AIF). ✦ …
17. Abstractness vs Instability • Dependency: say class Copy depends on KeyboardReader. • What makes a “good dependency”? One whose target is stable. • Stable: the class depends on little else and is not going to change. • Responsible class: many classes depend on it, so it is harder to change, and therefore it must be stable.
18. Abstractness vs Instability • Instability: I = Ce / (Ca + Ce), where Ca is afferent coupling (fan-in) and Ce is efferent coupling (fan-out). • So, shouldn't we make all classes stable? No: the system would then be unchangeable. • Therefore, some portion of the design must stay flexible enough to withstand change. • How? Abstract classes. • Abstractness = number of abstract classes / total number of classes in the group.
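Both formulas are one-liners. The sketch below also computes Robert Martin's distance from the main sequence, D = |A + I − 1|, which is not on the slide but is the standard companion metric for the diagram that follows; the package numbers in the example are hypothetical.

```python
def instability(ca, ce):
    """I = Ce / (Ca + Ce): 0 = maximally stable, 1 = maximally unstable."""
    return ce / (ca + ce) if (ca + ce) else 0.0

def distance_from_main_sequence(abstractness, inst):
    """D = |A + I - 1|: 0 means the group lies on the main sequence."""
    return abs(abstractness + inst - 1)

# Hypothetical package: 2 abstract classes out of 10, Ca = 8, Ce = 2.
a = 2 / 10
i = instability(ca=8, ce=2)
print(i)                                  # 0.2: mostly depended-upon, stable
print(distance_from_main_sequence(a, i))  # far from the main sequence
```

With A = 0.2 and I = 0.2 the package sits near the “zone of pain”: stable yet concrete, which suggests adding abstractions.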
19. Abstractness vs Instability: plotting abstractness (y-axis) against instability (x-axis): • (0, 0): maximally stable and concrete (the “zone of pain”). • (1, 0): maximally unstable and concrete. • (0, 1): maximally stable and abstract. • (1, 1): abstract yet maximally unstable, with no dependents (the “zone of uselessness”). • (0.5, 0.5): striking the balance. • The “main sequence” is the line between (0, 1) and (1, 0).
20. Let's conclude! Strive for quality, not numbers.
21. Thank you for your attention. tusharsharma@ieee.org