1. IMPROVING ANALOGY SOFTWARE EFFORT ESTIMATION USING FUZZY FEATURE SUBSET SELECTION ALGORITHM
Mohammad Azzeh [email_address]
Dr. Daniel Neagu [email_address]
Prof. Peter Cowling [email_address]
Computing Department, School of Informatics, University of Bradford
PROMISE'08
8. The Proposed Solution
- Apply fuzzy c-means clustering to the data set with M features
- Use the resulting partition matrix and cluster centres
- Select a feature subset of N dimensions (N ≤ M)
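The first step of the pipeline above is standard fuzzy c-means, which produces exactly the two artefacts the slide names: a partition (membership) matrix and a set of cluster centres. A minimal NumPy sketch follows; the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Cluster the rows of X into c fuzzy clusters (fuzziness exponent m).

    Returns (U, centres): U is the n-by-c partition matrix whose rows sum
    to 1, and centres is the c-by-d matrix of cluster centres.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random initial partition matrix, each row normalised to sum to 1.
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Centres are membership-weighted means of the data points.
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every centre (n-by-c).
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)  # avoid division by zero
        # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, centres
```

How the partition matrix and centres are then scored to pick the N-feature subset is the paper's contribution and is not reproduced here.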
13. Results…ISBSG
| Algorithm used in analogy model | One analogy | Mean of 2 analogies | Mean of 3 analogies |
| --- | --- | --- | --- |
| All features | 37.74% | 41.0% | 29.4% |
| Exhaustive search, Hill climbing, Random search | 28.25% | 30.3% | 30.2% |
| Forward subset selection | 33.3% | 30.4% | 31.2% |
| Backward subset selection | 34.7% | 38% | 34.4% |
| FFSS | 28.7% | 30.6% | 32.2% |
14. Results…ISBSG
| Algorithm used in analogy model | One analogy | Mean of 2 analogies | Mean of 3 analogies |
| --- | --- | --- | --- |
| All features | 31.6% | 33% | 20.62% |
| Exhaustive search, Hill climbing, Random search | 21.9% | 24% | 20.8% |
| Forward subset selection | 21.0% | 21.4% | 20.7% |
| Backward subset selection | 22.6% | 28.7% | 25.2% |
| FFSS | 21.8% | 22.3% | 22.7% |
15. Results…Desharnais
| Algorithm used in analogy model | One analogy | Mean of 2 analogies | Mean of 3 analogies |
| --- | --- | --- | --- |
| All features | 60.1% | 51.5% | 50.0% |
| Exhaustive search, Forward subset selection, Hill climbing, Random search | 38.2% | 39.4% | 36.4% |
| Backward subset selection | 42.4% | 43.9% | 46.6% |
| FFSS | 40.2% | 40.3% | 38.5% |
16. Results…Desharnais
| Algorithm used in analogy model | One analogy | Mean of 2 analogies | Mean of 3 analogies |
| --- | --- | --- | --- |
| All features | 41.7% | 41.0% | 36.1% |
| Exhaustive search, Forward subset selection, Hill climbing, Random search | 30.8% | 38.0% | 30.9% |
| Backward subset selection | 38.4% | 37.4% | 34.6% |
| FFSS | 32.4% | 33.3% | 31.7% |
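The columns in the tables above compare analogy models that predict from one, two, or three analogies: an estimate is the (mean) effort of the k most similar historical projects, and accuracy is typically reported as a relative-error percentage such as MMRE (the slides do not name the metric explicitly). A minimal sketch of both pieces, on hypothetical data rather than the ISBSG or Desharnais sets:

```python
import numpy as np

def analogy_estimate(train_X, train_effort, query, k=3):
    """Estimate effort for `query` as the mean effort of its k nearest
    neighbours (analogies) among the training projects, using Euclidean
    distance on already-normalised feature vectors."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return train_effort[nearest].mean()

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error: mean of |actual - pred| / actual."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs(actual - predicted) / actual)
```

With k = 1, 2, 3 this yields the "one analogy", "mean of 2 analogies", and "mean of 3 analogies" columns; each feature-selection algorithm changes only which columns of the project data enter the distance computation.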
18. Conclusions…cont.
Which FSS algorithm is suitable?
- Accuracy is the only concern: exhaustive search and Hill Climbing
- Less time with quite reasonable accuracy, and the data set is large: fuzzy feature subset selection, forward feature selection, or backward feature selection
- By dataset size: small → Random search; large → Hill Climbing