Comparing Approaches to Implement Feature Model Composition
Mathieu Acher1, Philippe Collet1, Philippe Lahire1, Robert France2
1 University of Nice Sophia Antipolis (France), Modalis Team (CNRS, I3S Laboratory)
2 Computer Science Department, Colorado State University
Context: Managing Variability
• Constructing a Repository of Medical Imaging Services
– deployable on Grid infrastructures
– services embed the business code (e.g., algorithms) and are invoked remotely through a standardized protocol
• Highly Parameterized Services
– efficiently extend, change, customize, or configure services for use in a particular context
– reusability and composability
– service as software product line (SPL) (SOAPL'08, MICCAI-Grid'08)
[Slide build (slides 3–6): feature models of the concerns of a medical imaging service on the Grid:
– Medical Image: Format (DICOM, Nifti, Analyze), Modality Acquisition (MRI, refined into T1/T2; CT, SPECT, PET), Anonymized
– Registration: Interactive, Method (Spatial, Frequency), Transformation (Linear: Rotation, Affine, Scaling; Non Grid)
– GridComputingNode: Processor (x32, x64), Operating System (Windows, Linux), FileSizeLimit
– NetworkProtocol: Cryptographic, Format (XML, HTTP), HeaderEncoding
– QoS: DynamicDimension, Measurement (Reliability, Time)
Legend: And-group, Optional, Mandatory, Xor-group, Or-group]
Issues in Variability Modeling
• Current variability modeling techniques often do not scale up to SPLs with a large number of features.
Scalability issues in terms of:
– construction
– evolution
– reasoning
Separation of Concerns in SPLs
• Large and monolithic variability models do not scale:
– use smaller models representing the variability of well-identified concerns
– when variability models are separated, composition operators are needed
• In earlier work, we proposed a set of composition operators for feature models (SLE'09)
• In this work, we focus on the merge operator
Purpose and Intended Audience
• An efficient, accurate implementation to automatically merge feature models
• Our interest here:
– determine how (MBE/AOM/FM-specific) techniques perform when implementing feature model merging
– and which techniques are the most suitable
• Intended audience:
– (1) SPL researchers working on feature modeling techniques or developers of feature modeling tools;
– (2) researchers/practitioners involved in the MBE/AOM community
Agenda
• Background and Motivation
– Feature models and Merge operators
• Requirements for Merge Operators
– Criteria
• Comparison of Different Approaches
– Results
• Conclusion
Background: Feature Models
• Hierarchy + Variability
– Mandatory features, Optional features
– Alternatives and Constraints
[Feature model of the Medical Image service: root Medical Image; Format (Xor-group: DICOM, Nifti, Analyze); Modality Acquisition (MRI, refined into T1/T2; CT, SPECT, PET); optional Anonymized. Legend: And-group, Optional, Mandatory, Xor-group, Or-group]
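The hierarchy-plus-variability structure above can be captured in a small data structure. The following is a minimal sketch under our own naming (the `Feature`/`Group` classes are ours, not from any FM tool), which also enumerates an FM's configuration semantics by brute force:

```python
from dataclasses import dataclass, field
from enum import Enum
from itertools import chain, combinations
from typing import List

class Group(Enum):
    AND = "and"  # children combined freely, subject to mandatory/optional
    XOR = "xor"  # exactly one child is selected
    OR = "or"    # at least one child is selected

@dataclass
class Feature:
    name: str
    group: Group = Group.AND
    mandatory: bool = True  # relevant when the parent is an And-group
    children: List["Feature"] = field(default_factory=list)

    def configurations(self):
        """Enumerate the sets of selected features (the FM's semantics)."""
        if self.group is Group.XOR:
            child_choices = [c for child in self.children
                             for c in child.configurations()]
        elif self.group is Group.OR:
            child_choices = []
            for subset in chain.from_iterable(
                    combinations(self.children, n)
                    for n in range(1, len(self.children) + 1)):
                choices = [set()]
                for child in subset:
                    choices = [c | d for c in choices
                               for d in child.configurations()]
                child_choices.extend(choices)
        else:  # And-group: mandatory children always, optional ones freely
            choices = [set()]
            for child in self.children:
                opts = child.configurations()
                if child.mandatory:
                    choices = [c | d for c in choices for d in opts]
                else:
                    choices = [c | d for c in choices for d in [set()] + opts]
            child_choices = choices
        return [{self.name} | c for c in child_choices]

# MRI with mandatory T1 and optional T2: configurations {MRI,T1}, {MRI,T1,T2}
fm1 = Feature("MRI", Group.AND, children=[
    Feature("T1", mandatory=True), Feature("T2", mandatory=False)])
# MRI with an Xor-group over T1, T2: configurations {MRI,T1}, {MRI,T2}
fm2 = Feature("MRI", Group.XOR, children=[Feature("T1"), Feature("T2")])
```

These two small FMs are exactly the inputs used in the merge examples of the following slides.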
Merge Operator: Principles
When two feature models (FMs) share several features, there is a need to merge the overlapping parts.
[Running example: two FMs over MRI, T1 and T2, one with mandatory T1 and optional T2, the other with an Xor-group over T1 and T2.]
[Build: which semantic properties of FM1 and FM2 should the merged FM preserve?]
Merge Operator: Union
One input FM (MRI with mandatory T1 and optional T2) denotes {{MRI, T1}, {MRI, T1, T2}}; the other (MRI with an Xor-group over T1 and T2) denotes {{MRI, T1}, {MRI, T2}}. The merged FM in union mode denotes {{MRI, T1}, {MRI, T1, T2}, {MRI, T2}}.
Merge Operator: Intersection
With the same input FMs, the merged FM in intersection mode denotes {{MRI, T1}}: the configurations valid in both input FMs.
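At the level of configuration semantics, the two merge modes reduce to set operations on configuration sets. A minimal sketch, abstracting each FM by its set of configurations (the hard part, which the talk compares implementations on, is producing a single merged FM that denotes exactly this result):

```python
# Each FM is abstracted by its set of configurations (sets of feature names).
fm1 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T1", "T2"})}
fm2 = {frozenset({"MRI", "T1"}), frozenset({"MRI", "T2"})}

def merge_union(a, b):
    """Union mode: configurations valid in at least one input FM."""
    return a | b

def merge_intersection(a, b):
    """Intersection mode: configurations valid in both input FMs."""
    return a & b

assert merge_union(fm1, fm2) == {frozenset({"MRI", "T1"}),
                                 frozenset({"MRI", "T1", "T2"}),
                                 frozenset({"MRI", "T2"})}
assert merge_intersection(fm1, fm2) == {frozenset({"MRI", "T1"})}
```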
Merge Operator: Requirements (1)
[Candidate merged FMs are compared: some violate the expected configuration set (no!); some are correct but not optimal, e.g., Nifti becomes a "dead" feature (present in the merged FM but selectable in no configuration); the desired result is both correct and optimal.]
Merge Operator: Requirements (2)
"Managing Variability in Workflow with Feature Model Composition Operators", Software Composition (SC) conference, 2010
Merge Operator: Requirements (3)
The ability of the merge operator to deal with several kinds of input FMs
Merge Operator: Requirements (4)
Aspects of the Implementation
Now the competition can start!
• Separate FMs
• AGG
• Kompose
• Kermeta
• Boolean Logic
– Large spectrum: from modeling/composition techniques to FM-specific solutions
– Some approaches have been proposed by other researchers
Separate FMs and Intersection
[Base and Aspect FMs are composed in three steps: 1. prime features; 2. pp'; 3. new root R with an And-group. Resulting configuration set: {{R, A, A', B, B'}} (Schobbens et al., 2007). The slide builds evaluate the approach against the criteria with mixed (+/–) scores.]
AGG
• Attributed Graph Grammar (AGG) tool
• Graph transformation
– Left-Hand Side (LHS): source graph
– Right-Hand Side (RHS): target graph
• Catalogue of merge rules (Segura et al., 2007)
– Only for the union mode
AGG and a non-trivial example
(Intersection mode)
On the Difficulties of AGG
• The semantic properties currently implemented are limited to the merge in union mode
– The intersection mode remains particularly challenging to implement
• A strategy based on patterns is difficult to realize
– AGG expressiveness: non-recursive patterns
• Negative application conditions can precisely locate the source of errors
Kompose
• Generic composition tool (Fleurey, France et al.)
• Two major phases:
– (1) Matching: identify model elements that describe the same concepts in the input models to be composed
– (2) Merging: matched model elements are merged to create new elements in the resulting model
• Each element type has a signature
– two elements with equivalent signatures are merged
[Figure: Feature and Operator signatures (union mode)]
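The two-phase idea can be sketched roughly as follows. This is our own illustration, not Kompose's actual code: features match when their signatures (here simply the feature name) are equal, and matched features are merged recursively, copying unmatched ones as-is:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Feature:
    name: str
    children: List["Feature"] = field(default_factory=list)

def signature(f: Feature) -> str:
    # Each element type defines a signature; for a feature, its name
    # is a natural choice.
    return f.name

def match(a: Feature, b: Feature) -> bool:
    return signature(a) == signature(b)

def merge(a: Feature, b: Feature) -> Optional[Feature]:
    """Phase 1: match by signature; phase 2: merge matched elements."""
    if not match(a, b):
        return None
    merged = Feature(a.name)
    remaining = list(b.children)
    for ca in a.children:
        cb = next((c for c in remaining if match(ca, c)), None)
        if cb is not None:
            remaining.remove(cb)
            merged.children.append(merge(ca, cb))
        else:
            merged.children.append(ca)  # unmatched a-child: copied as-is
    merged.children.extend(remaining)   # unmatched b-children: copied as-is
    return merged

base = Feature("MRI", [Feature("T1"), Feature("T2")])
aspect = Feature("MRI", [Feature("T1"), Feature("PET")])
m = merge(base, aspect)
assert [c.name for c in m.children] == ["T1", "T2", "PET"]
```

Note how purely local and name-driven the recursion is; the difficulty discussed next is precisely that deciding whether two features should be merged (e.g., in intersection mode) needs a global view of the variability, which this scheme cannot express.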
Kompose and a non-trivial example
(Intersection mode)
On the Difficulties of Kompose
• The compositional approach structured in two stages (matching and merging) is too restrictive for implementing an FM-specific merge operator
• Recursive detection of matching elements is not sufficient: a more global vision is needed to decide whether elements should be merged or not
– Post-conditions: hard to implement
– As Kompose implies local reasoning, handling constraints is not conceivable either
Experience with Kermeta
• Executable, imperative and object-oriented (meta-)modeling language
– Kompose is built on top of Kermeta
– we apply the same strategy as with Kompose, but without strictly following the compositional approach
• We gain some benefits, notably better coverage of the semantic properties: now that global and more complex reasoning is possible, some features are no longer unnecessarily added and fewer FM errors are generated
• There is still an issue when dealing with different hierarchies
• Finally, the handling of constraints appears to be impractical
Boolean Logic
• The set of configurations represented by an FM can be described by a propositional formula defined over a set of Boolean variables
– e.g., φ = A ∧ (A ⇔ B) ∧ (C ⇒ A), whose configurations are {{A, B}, {A, B, C}}
• We can define the intersection mode on this representation
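The encoding can be sketched by brute-force model enumeration (fine for tiny FMs; a real implementation would use a SAT solver or BDDs). The first formula and its two configurations are the ones from the slide; the intersection over a shared variable set is just the conjunction of the input formulas:

```python
from itertools import product

def models(variables, formula):
    """All satisfying assignments, returned as sets of true variables."""
    result = set()
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if formula(env):
            result.add(frozenset(v for v in variables if env[v]))
    return result

# phi = A and (A <=> B) and (C => A)
phi = lambda e: e["A"] and (e["A"] == e["B"]) and (not e["C"] or e["A"])
assert models(["A", "B", "C"], phi) == {frozenset({"A", "B"}),
                                        frozenset({"A", "B", "C"})}

# Intersection mode: conjunction of the input formulas (hypothetical FMs).
phi1 = lambda e: e["A"] and (not e["B"] or e["A"])   # A, with B implying A
phi2 = lambda e: e["A"] and (e["B"] == e["A"])       # A, with B <=> A
inter = lambda e: phi1(e) and phi2(e)
assert models(["A", "B"], inter) == (models(["A", "B"], phi1)
                                     & models(["A", "B"], phi2))
```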
Boolean Logic
• We have only a Boolean formula: where is the hierarchy? the variability information?
• Czarnecki et al. propose an algorithm to construct an FM from a Boolean formula (SPLC'07)
– The algorithm constructs a tree, with additional nodes for feature groups, that can be translated into a basic FM
– We first simplify the formula φ:
• if φ ∧ f is unsatisfiable, the feature F is dead and can be removed
• if φ ∧ ¬f is unsatisfiable, the feature F is a full mandatory feature
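The two simplification checks reduce to satisfiability tests, sketched here with brute-force enumeration (the helper names are ours; a real implementation would call a SAT solver):

```python
from itertools import product

def satisfiable(variables, formula):
    """True if some assignment satisfies the formula."""
    return any(formula(dict(zip(variables, vs)))
               for vs in product([False, True], repeat=len(variables)))

def is_dead(variables, phi, f):
    """F is dead if phi ∧ f is unsatisfiable: no configuration selects F."""
    return not satisfiable(variables, lambda e: phi(e) and e[f])

def is_mandatory(variables, phi, f):
    """F is full mandatory if phi ∧ ¬f is unsatisfiable."""
    return not satisfiable(variables, lambda e: phi(e) and not e[f])

VARS = ["A", "B", "C"]
phi = lambda e: e["A"] and (e["A"] == e["B"]) and (not e["C"] or e["A"])
# A and B appear in every configuration; C is genuinely optional.
assert is_mandatory(VARS, phi, "A") and is_mandatory(VARS, phi, "B")
assert not is_dead(VARS, phi, "C") and not is_mandatory(VARS, phi, "C")
```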
Boolean Logic: Strengths and Current Limits
• Experiment on a set of input FMs sharing the same set of features and the same hierarchy
– The algorithm indicates all parent-child relationships (mandatory features) and all possible optional subfeatures, such that the hierarchy of the merged FM corresponds to the hierarchies of the input FMs
– And-groups, Or-groups and Xor-groups can be efficiently restored in the resulting FM where necessary
• Strengths
– The semantic properties are respected by construction
– The technique does not introduce FM errors and does not unnecessarily increase the number of features
– Constraints in FMs can be expressed using the full expressiveness of Boolean logic, and different sets of features can be manipulated
– A priori detection of errors: the formula is unsatisfiable
• Current limits
– Hierarchy mismatch
– Explanation
Results
[Comparison table of the approaches against the criteria]
Model-based composition techniques
• Difficulties: why?
• The merge of FMs is not purely structural
– you cannot focus only on syntactical properties
– semantic transformations or semantics-preserving model composition are needed
• A new challenge for modeling tools?
– Of course, modeling solutions can be revisited
• Other modeling approaches and technologies can be considered and may emerge to outperform the solutions considered in this paper
– e.g., using another graph transformation language
Conclusion
• The implementation of a merge operator for FMs is an interesting challenge:
– We defined a set of criteria to systematically evaluate an implementation
– We compared MBE/AOM/state-of-the-art techniques
– We proposed a solution based on Boolean logic that fulfills most of the criteria
• and overcomes some limitations of our earlier work
• Future Work – Open Issues
– diff and refactoring operations for FMs
– practical use of merge operators in different domains
Related Work
• Schobbens, P.Y., Heymans, P., Trigaux, J.C., Bontemps, Y.: Generic semantics of feature diagrams. Comput. Netw. 51(2) (2007) 456–479
• Segura, S., Benavides, D., Ruiz-Cortés, A., Trinidad, P.: Automated merging of feature models using graph transformations. Post-proceedings of the Second Summer School on GTTSE 5235 (2008) 489–505
• Fleurey, F., Baudry, B., France, R.B., Ghosh, S.: A generic approach for automatic model composition. In Giese, H., ed.: MoDELS Workshops, Springer (2007) 7–15
• Reddy, Y.R., Ghosh, S., France, R.B., Straw, G., Bieman, J.M., McEachen, N., Song, E., Georg, G.: Directives for composing aspect-oriented design class models. Transactions on Aspect-Oriented Software Development 3880 (2006) 75–105
• Czarnecki, K., Wasowski, A.: Feature diagrams and logics: There and back again. In: SPLC 2007. (2007) 23–34
• Acher, M., Collet, P., Lahire, P., France, R.: Composing Feature Models. In: 2nd Int’l Conference on Software Language Engineering (SLE’09). LNCS (2009) 20
A non-trivial example