
Comparing Approaches to Implement Feature Model Composition

Mathieu Acher, Associate Professor
3 Jul 2010

  1. Comparing Approaches to Implement Feature Model Composition. Mathieu Acher¹, Philippe Collet¹, Philippe Lahire¹, Robert France². ¹ University of Nice Sophia Antipolis (France), Modalis Team (CNRS, I3S Laboratory). ² Computer Science Department, Colorado State University.
  2. Context: Managing Variability • Constructing a repository of medical imaging algorithms – deployable on Grid infrastructures – services embed the business code and are invoked remotely through a standardized protocol • Highly parameterized services – efficiently extend, change, customize, or configure services for use in a particular context – reusability and composability – service as a software product line (SPL) • (SOAPL'08, MICCAI-Grid'08)
  3. Context: Managing Variability • Constructing a repository of medical imaging services – deployable on Grid infrastructures – services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol • Highly parameterized services [Feature diagram: the Medical Image feature model, with Modality (MRI with T1/T2, CT, SPECT, PET), Acquisition, Format (DICOM, Nifti, Analyze) and Anonymized; legend: And-group, Xor-group, Or-group, Optional, Mandatory]
  4. Context: Managing Variability • Highly parameterized services (continued) [Feature diagram: the Medical Image feature model together with the Registration feature model: Transformation (Rotation, Scaling, Affine), Method (Linear, Non Grid, Spatial, Frequency), Interactive]
  5. Context: Managing Variability • Highly parameterized services (continued) [Feature diagram: adds the GridComputingNode feature model: Operating System (Windows, Linux), Processor (x32, x64), FileSizeLimit]
  6. Context: Managing Variability • Highly parameterized services (continued) [Feature diagram: adds the QoS feature model, with NetworkProtocol (HeaderEncoding, Format, XML, HTTP), Measurement (Cryptographic, Reliability, Time), Dimension and Dynamic features]
  7. Issues in Variability Modeling • Current variability modeling techniques often do not scale up to SPLs with a large number of features • Scalability issues in terms of – construction – evolution – reasoning
  8. Separation of Concerns in SPLs • Instead of a large and monolithic variability model, use smaller models representing the variability of well-identified concerns – When variability models are separated, composition operators are needed • In earlier work, we proposed a set of composition operators for feature models (SLE'09) • In this work, we focus on the merge operator
  9. Purpose and Intended Audience • An efficient, accurate implementation to automatically merge feature models • Our interest here: – determine how MBE/AOM/FM-specific techniques perform when implementing feature model merging – and which techniques are the most suitable • Intended audience: – (1) SPL researchers working on feature modeling techniques or developers of feature modeling tools; – (2) researchers and practitioners in the MBE/AOM community
  10. Agenda • Background and Motivation – Feature models and merge operators • Requirements for Merge Operators – Criteria • Comparison of Different Approaches – Results • Conclusion
  11. Background: Feature Models • Hierarchy + Variability – Mandatory features, Optional features – Alternatives and Constraints [Feature diagram: the Medical Image feature model, with Modality (MRI with T1/T2, CT, SPECT, PET), Acquisition, Format (DICOM, Nifti, Analyze) and Anonymized]
  12. Background: Feature Models • Hierarchy + Variability – Mandatory features, Optional features – Alternatives and Constraints [same feature diagram]
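To make the feature-diagram notation concrete, here is a minimal sketch of how such a model can be represented and how its valid configurations can be enumerated. It is not the paper's metamodel: the Feature class, the group encodings and the MedicalImage excerpt are illustrative assumptions.

```python
from itertools import product

class Feature:
    """A feature whose children form an 'and', 'or' or 'xor' group.
    In an 'and' group, children listed in `optional` may be deselected."""
    def __init__(self, name, group="and", children=(), optional=()):
        self.name = name
        self.group = group
        self.children = list(children)
        self.optional = set(optional)

    def configurations(self):
        """All valid configurations (frozensets of feature names) of this sub-tree."""
        if not self.children:
            return {frozenset({self.name})}
        per_child = []
        for child in self.children:
            choices = set(child.configurations())
            # a child may be left unselected if it is optional or sits in an or/xor group
            if self.group in ("or", "xor") or child.name in self.optional:
                choices.add(frozenset())
            per_child.append(choices)
        configs = set()
        for combo in product(*per_child):
            selected = [c for c in combo if c]
            if self.group == "xor" and len(selected) != 1:
                continue                      # xor: exactly one child selected
            if self.group == "or" and not selected:
                continue                      # or: at least one child selected
            configs.add(frozenset({self.name}).union(*combo))
        return configs

# Excerpt of the Medical Image feature model from the slides (structure assumed)
mri = Feature("MRI", group="xor", children=[Feature("T1"), Feature("T2")])
modality = Feature("Modality", group="xor",
                   children=[mri, Feature("CT"), Feature("SPECT"), Feature("PET")])
medical_image = Feature("MedicalImage", group="and",
                        children=[modality, Feature("Anonymized")],
                        optional={"Anonymized"})
print(len(medical_image.configurations()))   # 10 configurations for this excerpt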
  13. Merge Operator: Principles [Two input feature models, each with an MRI feature and children T1 and T2] When two feature models (FMs) share several features, there is a need to merge the overlapping parts.
  14. Merge Operator: Principles [FM1 and FM2, each rooted at MRI with children T1 and T2] Semantics? Which properties of the inputs should the merged FM (also rooted at MRI, with T1 and T2) preserve?
  15. Merge Operator: Union [FM1 and FM2, both rooted at MRI with children T1 and T2] Configurations of FM1: {{MRI, T1}, {MRI, T2}}; configurations of FM2: {{MRI, T1}, {MRI, T1, T2}}. In union mode, the merged FM must represent their union: {{MRI, T1}, {MRI, T2}, {MRI, T1, T2}}.
  16. Merge Operator: Intersection [Same FM1 and FM2] Configurations of FM1: {{MRI, T1}, {MRI, T2}}; configurations of FM2: {{MRI, T1}, {MRI, T1, T2}}. In intersection mode, the merged FM must represent their intersection: {{MRI, T1}}, i.e., MRI with T1.
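Stated on configuration sets, the two merge modes are just set union and set intersection. A minimal sketch of the example above (the variable names are illustrative):

```python
# Configuration sets of the two input FMs, as given on the slides
fm1 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T2"})}
fm2 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T1", "T2"})}

union_mode = fm1 | fm2          # what the merged FM must represent in union mode
intersection_mode = fm1 & fm2   # what it must represent in intersection mode

assert union_mode == {frozenset({"MRI", "T1"}),
                      frozenset({"MRI", "T2"}),
                      frozenset({"MRI", "T1", "T2"})}
assert intersection_mode == {frozenset({"MRI", "T1"})}
```

The hard part, addressed in the rest of the talk, is producing a feature diagram (hierarchy, groups, constraints) whose configuration semantics is exactly this set.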
  17. Merge Operator: Requirements (1) [Candidate merged FMs annotated as correct ("OK") or incorrect ("no!") with respect to the expected set of configurations]
  18. Merge Operator: Requirements (1) [Candidate merged FMs annotated "OK" but not optimal: Nifti is a "dead" feature]
  19. Merge Operator: Requirements (1) [A candidate merged FM where everything is OK]
  20. Merge Operator: Requirements (2) See "Managing Variability in Workflow with Feature Model Composition Operators", Software Composition (SC) conference, 2010.
  21. Merge Operator: Requirements (3) The ability of the merge operator to deal with several kinds of input FMs.
  22. Merge Operator: Requirements (4) Aspects of the implementation.
  23. Now the competition can start! • Separate FMs • AGG • Kompose • Kermeta • Boolean Logic – A large spectrum: from modeling/composition techniques to FM-specific solutions – Some of these approaches have been proposed by other researchers
  24. Separate FMs and Intersection (Base / Aspect) Following Schobbens et al. 2007: 1. Prime the features of one FM; 2. Add a constraint p ⇔ p' between each shared feature and its primed copy; 3. Add a new root R with an And-group over both input FMs. The composed model admits the single configuration {{R, A, A', B, B'}}.
  25. Separate FMs and Intersection (Base / Aspect) [evaluation marks against the criteria for the configuration {{R, A, A', B, B'}}: --, +, --, --]
  26. Separate FMs and Intersection (Base / Aspect) [further evaluation marks: +, -]
  27. Separate FMs and Intersection (Base / Aspect) [evaluation marks: ++, ++, ++]
  28. Separate FMs and Intersection (Base / Aspect) [evaluation marks: ++, ++, ++, continued]
  29. AGG • Attributed Graph Grammar system • Graph transformation – Left-Hand Side (LHS): source graph – Right-Hand Side (RHS): target graph • Catalogue of merge rules (Segura et al. 2007) – Only for the union mode
  30. AGG and a non-trivial example (intersection mode) [graph transformation rules applied to the example FMs]
  31. On the Difficulties of AGG • The semantics properties currently implemented are limited to the merge in union mode – The intersection mode remains particularly challenging to implement • A strategy based on patterns is difficult to realize – AGG expressiveness: non-recursive patterns • Negative application conditions can precisely locate the source of errors [evaluation marks: +, --, -, -, -]
  32. Kompose • Generic composition tool (Fleurey, France et al.) • Two major phases: – (1) the matching phase identifies model elements that describe the same concepts in the input models to be composed; – (2) in the merging phase, matched model elements are merged to create new elements in the resulting model • Each element type (Feature, Operator) has a signature – two elements with equivalent signatures are merged (union mode)
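A rough sketch of the two-phase, signature-driven flavour of Kompose, transposed to feature models. The real tool operates on Kermeta/EMF models; the signature functions and the dictionary-based merge below are illustrative assumptions, not Kompose's API.

```python
# Signatures: a Feature matches by name; an Operator (the variability operator
# attached to a feature) would match by its kind. operator_signature is shown
# only to illustrate that each element type gets its own signature.
def feature_signature(feature):
    return ("Feature", feature["name"])

def operator_signature(operator):
    return ("Operator", operator["kind"])      # e.g. "and", "or", "xor", "opt"

def kompose_like_merge(features1, features2):
    """Matching phase: index elements by signature.
    Merging phase: matched features are merged (children unioned),
    unmatched features are simply copied (union-mode flavour)."""
    merged = {feature_signature(f): dict(f) for f in features1}
    for f in features2:
        sig = feature_signature(f)
        if sig in merged:
            merged[sig]["children"] = sorted(set(merged[sig]["children"]) |
                                             set(f["children"]))
        else:
            merged[sig] = dict(f)
    return list(merged.values())

fm1 = [{"name": "MRI", "children": ["T1", "T2"]}]
fm2 = [{"name": "MRI", "children": ["T1"]}]
print(kompose_like_merge(fm1, fm2))   # one MRI feature with children T1 and T2
```

As the next slides argue, this purely local, signature-by-signature matching is exactly what makes the intersection mode and cross-tree constraints hard to handle.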
  33. Kompose and a non-trivial example (intersection mode) • Two major phases: – (1) the matching phase identifies model elements that describe the same concepts in the input models to be composed; – (2) in the merging phase, matched model elements are merged to create new elements in the resulting model
  34. On the Difficulties of Kompose • The compositional approach structured in two stages (matching and merging) is too restrictive for implementing an FM-specific merge operator • Recursive detection of matching elements is not sufficient, since a more global vision is needed to decide whether elements should be merged or not – Post-conditions: hard to implement – As Kompose implies local reasoning, handling constraints is not conceivable either (intersection mode) [evaluation marks: -, -, --]
  35. Experience with Kermeta • Executable, imperative and object-oriented (meta-)modeling language – Kompose is built on top of Kermeta – we apply the same strategy as with Kompose, but without strictly following the compositional approach • We gain some benefits, notably better coverage of the semantics properties: now that global and more complex reasoning is possible, unnecessary features are no longer added and fewer FM errors are generated • There is still an issue when dealing with different hierarchies • Finally, the handling of constraints appears to be impractical [evaluation mark: -]
  36. Boolean Logic • The set of configurations represented by an FM can be described by a propositional formula defined over a set of Boolean variables – {{A, B}, {A, B, C}} corresponds to A ∧ (A ⇔ B) ∧ (C ⇒ A) • We can then define the intersection mode
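A small sketch of this encoding, using brute-force enumeration instead of a SAT solver or BDDs. The second formula, phi2, is a hypothetical FM added only to illustrate that intersection over a shared feature set amounts to conjunction.

```python
from itertools import product

FEATURES = ["A", "B", "C"]

def configurations(formula):
    """All satisfying assignments of `formula`, seen as sets of selected features."""
    all_cfgs = (frozenset(f for f, v in zip(FEATURES, bits) if v)
                for bits in product([False, True], repeat=len(FEATURES)))
    return {cfg for cfg in all_cfgs if formula(cfg)}

def phi(cfg):     # the slide's example: A & (A <=> B) & (C => A)
    A, B, C = ("A" in cfg, "B" in cfg, "C" in cfg)
    return A and (A == B) and ((not C) or A)

print(configurations(phi))            # {{A, B}, {A, B, C}}, as on the slide

def phi2(cfg):    # hypothetical second FM over the same features: A & (A <=> B) & not C
    A, B, C = ("A" in cfg, "B" in cfg, "C" in cfg)
    return A and (A == B) and not C

# Intersection mode over a shared feature set: the conjunction of the two formulas
merged = lambda cfg: phi(cfg) and phi2(cfg)
print(configurations(merged))         # {{A, B}}
```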
  37. Boolean Logic • We have only a Boolean formula: where is the hierarchy? where is the variability information? • Czarnecki et al. propose precisely such an algorithm to construct an FM from a Boolean formula (SPLC'07) – The algorithm builds a tree with additional nodes for feature groups that can be translated into a basic FM – We first simplify the formula: • if φ ∧ f is unsatisfiable, the feature f is dead and can be removed • the feature f is a full mandatory feature if φ ∧ ¬f is unsatisfiable
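A sketch of these two simplification checks. Satisfiability is decided here by brute-force enumeration for readability; a real implementation would rely on a SAT solver or BDDs, as in the Czarnecki et al. algorithm.

```python
from itertools import product

FEATURES = ["A", "B", "C"]

def satisfiable(formula):
    """Brute-force satisfiability over the (small) feature set."""
    return any(formula({f for f, v in zip(FEATURES, bits) if v})
               for bits in product([False, True], repeat=len(FEATURES)))

def phi(cfg):     # running example: A & (A <=> B) & (C => A)
    A, B, C = ("A" in cfg, "B" in cfg, "C" in cfg)
    return A and (A == B) and ((not C) or A)

for f in FEATURES:
    dead = not satisfiable(lambda cfg, f=f: phi(cfg) and f in cfg)         # phi ∧ f
    mandatory = not satisfiable(lambda cfg, f=f: phi(cfg) and f not in cfg)  # phi ∧ ¬f
    print(f, "dead" if dead else "mandatory" if mandatory else "optional")
# A mandatory, B mandatory, C optional; no dead feature in this example
```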
  38. Boolean Logic: Strengths and Current Limits • Experiment on a set of input FMs sharing the same set of features and the same hierarchy – The algorithm recovers all parent-child relationships (mandatory features) and all optional subfeatures, so that the hierarchy of the merged FM corresponds to the hierarchy of the input FMs – And-groups, Or-groups and Xor-groups can be efficiently restored in the resulting FM when necessary • Strengths – The semantics properties are respected by construction – The technique does not introduce FM errors and does not unnecessarily increase the number of features – Constraints in FMs can be expressed using the full expressiveness of Boolean logic, and different sets of features can be manipulated – A priori detection of errors: the formula is unsatisfiable • Current limits – Hierarchy mismatch – Explanation
  39. Results [Comparison table of the five approaches against the criteria]
  40. Model-based composition techniques • Difficulties. Why? • The merge of FMs is not purely structural – You cannot focus on syntactic properties only – Semantic transformations or semantics-preserving model composition are needed • A new challenge for modeling tools? – Of course, modeling solutions can be revisited • Other modeling approaches and technologies can be considered and may emerge to outperform the solutions considered in this paper • e.g., using another graph transformation language
  41. Conclusion • The implementation of a merge operator for FMs is an interesting challenge: – We defined a set of criteria to systematically evaluate an implementation – We compared MBE/AOM/state-of-the-art techniques – We proposed a solution based on Boolean logic that fulfills most of the criteria • and overcomes some limitations of our earlier work • Future work and open issues – Diff and refactoring operations for FMs – Practical use of merge operators in different domains
  42. ?
  43. Related Work • Schobbens, P.Y., Heymans, P., Trigaux, J.C., Bontemps, Y.: Generic semantics of feature diagrams. Comput. Netw. 51(2) (2007) 456–479 • Segura, S., Benavides, D., Ruiz-Cortés, A., Trinidad, P.: Automated merging of feature models using graph transformations. Post-proceedings of the Second Summer School on GTTSE 5235 (2008) 489–505 • Fleurey, F., Baudry, B., France, R.B., Ghosh, S.: A generic approach for automatic model composition. In Giese, H., ed.: MoDELS Workshops, Springer (2007) 7–15 • Reddy, Y.R., Ghosh, S., France, R.B., Straw, G., Bieman, J.M., McEachen, N., Song, E., Georg, G.: Directives for composing aspect-oriented design class models. Transactions on Aspect-Oriented Software Development 3880 (2006) 75–105 • Czarnecki, K., Wasowski, A.: Feature diagrams and logics: There and back again. In: SPLC 2007. (2007) 23–34 • Acher, M., Collet, P., Lahire, P., France, R.: Composing Feature Models. In: 2nd Int'l Conference on Software Language Engineering (SLE'09). LNCS (2009)
  44. A non-trivial example