On the Design of a Testbed for AOSD

Alessandro Garcia, Computing Department. May 2007.


Transcript of On the Design of a Testbed for AOSD

Page 2: On the Design of a Testbed for AOSD

Key Researchers

• Lancaster – UK: Phil Greenwood, Alessandro Garcia, Eduardo Figueiredo, Nelio Cacho, Claudio Sant’Anna, Americo Sampaio, Awais Rashid
• Recife – Brazil: Sergio Soares, Marcos Dosea, Paulo Borba
• Kiel – Germany & Waterloo – Canada: Thiago Bartolomei
• Lisbon – Portugal: Joao Araujo, Ana Moreira, Isabel Brito, Ricardo Argenton
• Malaga – Spain: Monica Pinto, Lidia Fuentes
• Salvador & Natal – Brazil: Thais Batista, Christina Chavez, Lyrene Silva
• Other contributors: Milan/Italy, Fraunhofer/Germany, Colorado/USA, Rio/Brazil, INRIA/France, Siemens/Germany…

Page 3: On the Design of a Testbed for AOSD

AOSD: from embryonic techniques…

• … to integration and testing in real-world settings
• Growing need to assess AO methodologies
– AOSD is becoming a sufficiently established research community
• Need to compare AO approaches with other contemporary modularization approaches
• Creation of an experimental environment for end-to-end evaluation of AOSD techniques
– requirements
– architecture
– design
– implementation
– maintenance

Page 4: On the Design of a Testbed for AOSD

Countless barriers

• Available systems lack proper documentation
• Difficult to find multiple AO and non-AO implementations of the same system
– even worse: guaranteeing that the non-AO and AO decompositions are good ones is tricky
• PhD research studies: difficult to find, or develop from scratch, a plausible “benchmark”
– many risks: time-consuming task, inherent bias, etc…
– collaboration is the only alternative left
• Quantitative or qualitative indicators are often NOT ready for use
• Replication of studies becomes a pain

Page 5: On the Design of a Testbed for AOSD

A Testbed for AOSD

• Towards more scientific and cohesive research
– serve as a communication and collaboration vehicle
• achieve widely-accepted exemplars, indicators, and data that can be reused and refined
– facilitate the identification of “unknown” problems and benefits inherent to AOSD
• effects throughout the lifecycle
• bottlenecks specific to certain SE phases and their transitions
– accelerate progress in the area by offering context to pinpoint technique-specific problems

Page 6: On the Design of a Testbed for AOSD

Testbeds vs. Software Engineering

• Recent recognition of the pivotal role of benchmarking in community cohesion and rapid progress1
• Some fields have made progress on benchmarking
– e.g. reverse engineering, software refactoring, and program comprehension
• However…
– there is not much work on benchmarking modularization techniques
– reports about the process of designing, instantiating, and evolving benchmarks in software engineering are rare

1S. Sim, S. Easterbrook, R. Holt. Using Benchmarking to Advance Research: A Challenge to Software Engineering. Proc. 25th Intl. Conf. on Software Engineering (ICSE), Portland, Oregon, pp. 74-83, May 2003.

Page 7: On the Design of a Testbed for AOSD

Timeline

[Timeline figure, June 2006 to December 2006, spanning two phases labelled “Testbed design” and “Benchmark instantiations”. Milestones: proposal accepted (June 2006); choice of the benchmark goal; circulation of the questionnaire; contributions of artefacts start; 1st benchmark definition starts (definition of indicators); preparation of the 1st pilot stability study starts; choice of the change scenarios; preparation of the pilot AO requirements study starts; conclusion of the 1st study. New needs identified along the way, e.g.: concern interaction metrics; redefinition of metrics for CaesarJ; measurement reliability (tool support).]

Page 8: On the Design of a Testbed for AOSD

Outline

• Testbed design: the first benchmark
• Testbed elements
• Testbed instantiation
• Testbed evolution
• EA & the Testbed

Page 9: On the Design of a Testbed for AOSD

Testbed design: the first benchmark

• a number of decisions… such as:
• application selection
– it should be a system likely to be universally usable for different assessment purposes
– ten candidate applications were examined
• Tourist Guide System, Pet Store, J2ME Games, CVS Eclipse Plug-In, OpenORB middleware system, etc.
– each application was ranked according to weighted criteria

Page 10: On the Design of a Testbed for AOSD

Selection Criteria

• Examples (a sketch of one possible weighted scoring follows this list)
– availability of AO and non-AO implementations (important)
– availability of documentation (least important)
– system generality (important)
– heterogeneous types of concern interactions (most important)
– aspects emerging in different phases (least important)
– previous acceptance by the research community (most important)
– paradigm neutral (most important)
– a variety of crosscutting and non-crosscutting concerns (important)
• e.g. widely-scoped vs. more localized ones
• e.g. those requiring different uses of AO mechanisms
– elegance of the AO and non-AO decompositions (important)
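
To make the ranking concrete, here is a minimal Java sketch of one possible weighted scoring scheme. The numeric weights (3 = most important, 2 = important, 1 = least important) and the 0..1 satisfaction scores are assumptions for illustration; the slides do not give the actual scale.

    import java.util.LinkedHashMap;
    import java.util.Map;

    class CandidateRanking {
        // Criterion -> weight, mirroring the importance labels above
        // (the numeric scale itself is a hypothetical choice).
        static final Map<String, Integer> WEIGHTS = new LinkedHashMap<>();
        static {
            WEIGHTS.put("AO and non-AO implementations available", 2);
            WEIGHTS.put("documentation available", 1);
            WEIGHTS.put("system generality", 2);
            WEIGHTS.put("heterogeneous concern interactions", 3);
            WEIGHTS.put("acceptance by the research community", 3);
            WEIGHTS.put("paradigm neutral", 3);
        }

        // satisfaction: how well a candidate meets each criterion, in [0, 1].
        static double score(Map<String, Double> satisfaction) {
            return WEIGHTS.entrySet().stream()
                    .mapToDouble(e -> e.getValue()
                            * satisfaction.getOrDefault(e.getKey(), 0.0))
                    .sum();
        }
    }

Candidates are then ranked by descending score; under this reading, the selected application (Health Watcher, next slide) presumably ranked highest among the ten candidates.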

Page 11: On the Design of a Testbed for AOSD

Health Watcher (HW) System1

• Java version was developed by a company in Brazil
• Several desirable properties
– real-life system
– non-trivial
– Java and AspectJ implementations available
• elegant OO and AO designs
– some requirements, architecture and design documentation available
– designed with modularity, reusability, maintainability and stability in mind
– used in a reasonable number of studies that report well-accepted non-AO and AO design decompositions:
• OOPSLA.02, FSE.06, S:P&E 2006, ICSM.06, EWSA.06, EA.06, ESEM.07, etc…
• Important that multiple applications are used in the testbed to allow broad conclusions to be made

1Soares et al. Implementing Distribution and Persistence Aspects with AspectJ. OOPSLA 2002

Page 12: On the Design of a Testbed for AOSD

Health Watcher Architecture

[Architecture diagram, four layers annotated with change scenarios. GUI Layer: HWServlet, OpServlets. Distribution Layer: IFacade; Command pattern (Command, OpCommands, ExtraOpCommands). Business Layer: HealthWatcherFacade; Complaint, Employee, HealthUnit, Speciality, Symptom; ComplaintRecord, EmployeeRecord, HealthUnitRecord, SpecialityRecord, SymptomRecord. Data Layer: IComplaintRep, IEmployeeRep, IHealthUnitRep, ISpecialityRep, ISymptomRep; ComplaintRep, EmployeeRep, HealthUnitRep, SpecialityRep, SymptomRep. Legend: Sx+ = component added in scenario x; Sx’ = component modified in scenario x; Sx- = component removed in scenario x. Markers for scenarios 1, 8 and 9 annotate added, modified and removed components throughout the layers. A Java sketch of this layering follows.]
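
The diagram’s essential shape can be condensed into a few lines of Java. This is an illustrative sketch only: the layering and type names come from the diagram, but the operations and method bodies are assumptions, not the actual Health Watcher code.

    // GUI layer: servlets reach the business layer only through IFacade,
    // which is what lets distribution and persistence be aspectized
    // without touching the servlets.
    class HWServlet {
        private final IFacade facade;
        HWServlet(IFacade facade) { this.facade = facade; }
        void doPost(Complaint c) { facade.registerComplaint(c); }
    }

    // Distribution-layer boundary (hypothetical operation).
    interface IFacade {
        void registerComplaint(Complaint c);
    }

    // Business layer: a single facade in front of the record classes.
    class HealthWatcherFacade implements IFacade {
        private final ComplaintRecord complaints = new ComplaintRecord();
        public void registerComplaint(Complaint c) { complaints.insert(c); }
    }

    // Record classes delegate to the data layer through I*Rep interfaces.
    class ComplaintRecord {
        private final IComplaintRep rep = new ComplaintRep();
        void insert(Complaint c) { rep.save(c); }
    }

    interface IComplaintRep { void save(Complaint c); }
    class ComplaintRep implements IComplaintRep {
        public void save(Complaint c) { /* persistence omitted */ }
    }

    class Complaint { /* domain fields omitted */ }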

Page 13: On the Design of a Testbed for AOSD

Artefacts Repository

• Initially a limited number of approaches have been applied
– Requirements (e.g. Use Cases, V-Graph, AOV-Graph, AORE, AORA)
– Architecture (e.g. UML, ACME, AO ADL, AspectualACME, AOGA)
– Design (UML, Theme/UML, aSideML)
– Implementation (Java, AspectJ, CaesarJ, AWED, JBoss)
• Contributors reported:
– strengths and weaknesses of the HW system
– issues to be benchmarked

Page 14: On the Design of a Testbed for AOSD

What issues to benchmark?

• Questionnaires sent to a representative set of SE institutions
– to understand in which areas the existing AO techniques…
• … were mature enough
– phases: requirements engineering, detailed design and implementation
– e.g. “pointcut languages”
• … were still evolving (e.g. aspect interaction)
• … targeted which quality attributes (e.g. enhanced maintainability and reusability)
• Investigation of typical “ilities” in previous empirical studies involving modularization techniques (e.g. OO, AO, etc…):
– modularity, maintainability and reusability
• e.g. software stability
– reliability
• e.g. error proneness
– specification effort and outcome quality
• e.g. time spent, recall, and precision

Page 15: On the Design of a Testbed for AOSD

What issues to benchmark?

• Impact of AO mechanisms on particular SE activities or phases
– phases are often assessed in isolation
– desirable to determine the effects of one phase on subsequent phases
• e.g. how do changes in an AO program impact the stability of the architecture decomposition (compared with OO program changes)?
• Which motivating comparison?
– OO vs. AO? or
– multiple AO techniques?

Page 16: On the Design of a Testbed for AOSD

Enhancing HW System…

• … to include changes and produce releases
– both widely-scoped and localized changes
– changes to both CCCs and non-CCCs
– different categories: perfective changes and refactorings, corrective changes, evolutionary changes, etc…
• … to address the identified weaknesses w.r.t.
– our original criteria
• e.g. include localized CCCs, such as design patterns
– feedback received from the contributors
• e.g. need for improving the categories of aspect interactions
• … based on the history of HW changes in the deployed Java system

Page 17: On the Design of a Testbed for AOSD

Stability Indicators

• Generality
– indicators not tied to one specific artefact/technique type
• Traceability in the assessment process
– support assessment of the effects of one phase on subsequent phases
– SE-wide properties
• modularity: cohesion, coupling, SoC, interface simplicity, etc…
• change impact and stability
• concern interaction

Page 18: On the Design of a Testbed for AOSD

Testbed Elements

[Diagram of the testbed elements. Design stability study; consequence: more mature elements.]

Page 19: On the Design of a Testbed for AOSD

Outline

• Testbed design: the first benchmark
• Testbed elements
• Testbed instantiation
– study on architecture and implementation stability1
• Java vs. AspectJ vs. CaesarJ
– study on AO requirements engineering2

1P. Greenwood et al. On the Impact of Aspectual Decompositions on Design Stability: An Empirical Study. Proc. of the 21st European Conference on Object-Oriented Programming (ECOOP.07), Germany, July 2007. (to appear)
2A. Sampaio et al. A Comparative Study of Aspect-Oriented Requirements Engineering Approaches. Proc. of the 1st International Symposium on Empirical Software Engineering and Measurement (ESEM.07), September 2007. (to appear)

Page 20: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (Design Stability Study)

• Application of the selected metric suites to each of the artefacts generated
– Java, AspectJ, and CaesarJ programs
– non-AO architecture (N-tier architecture) vs. AO architecture
• Multi-dimensional analysis, including:
– modularity sustenance
– observance of architectural and design ripple effects
– which categories of aspects (and respective interfaces) have or have not exhibited stability
– satisfaction of basic design principles through the releases

Page 21: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (Design Stability Study)1

• Outcomes overview
+ Concerns aspectized upfront tend to show superior modularity stability
+ AO solutions required less intrusive modification of modules
+ Aspectual decompositions demonstrated superior satisfaction of the Open-Closed principle
- Highlighted the “fragile pointcut” problem: ripple effects observed in interacting aspect interfaces (see the sketch below)
- AO modifications tended to propagate to seemingly unrelated modules
+ Architectural ripple effects observed only in the OO solution: undesirable changes related to exception handling in multiple layers

1P. Greenwood et al. On the Impact of Aspectual Decompositions on Design Stability: An Empirical Study. Proc. of the 21st European Conference on Object-Oriented Programming (ECOOP.07), Germany, July 2007.
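
To make the fragile pointcut problem concrete, here is a minimal AspectJ sketch. The aspect and the naming convention are hypothetical, not taken from the actual Health Watcher code; the point is that the pointcut is coupled to method names, so a rename or a new method outside the pattern silently changes what is advised.

    // Hypothetical logging aspect over the business facade.
    public aspect DistributionLogging {
        // Fragile: matches by naming convention. If registerComplaint is
        // renamed, or a new operation does not start with "register",
        // this pointcut silently stops (or fails to start) matching;
        // the ripple effect surfaces at the aspect interface.
        pointcut businessOps():
            execution(* HealthWatcherFacade.register*(..));

        before(): businessOps() {
            System.out.println("entering " + thisJoinPointStaticPart);
        }
    }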

Page 22: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (AO Requirements Study)2

2A. Sampaio et al. A Comparative Study of Aspect-Oriented Requirements Engineering Approaches. Proc. of the 1st International Symposium on Empirical Software Engineering and Measurement (ESEM.07), September 2007. (to appear)

Page 23: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (AO Requirements Study)

• comparison of four eminent AORE approaches
– time effectiveness (person-minutes)
– accuracy of the produced outcome
• precision and recall of the models produced (defined below)
• example of a research question:
– which activities are the main bottlenecks in terms of effort for each AORE approach?
• target: 1st author interested in learning which tasks should be automated in the EA-Miner tool
• main outcome: composition specification and conflict analysis
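
For reference, “precision and recall” here are the standard information-retrieval measures applied to the produced requirements models; this generic formulation is added for clarity and is not the study’s exact operationalization:

\[
\text{precision} = \frac{TP}{TP + FP},
\qquad
\text{recall} = \frac{TP}{TP + FN}
\]

where TP counts model elements correctly identified, FP spurious elements, and FN elements that should have been identified but were missed.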

Page 24: On the Design of a Testbed for AOSD

Timeline - Evolution

[The earlier timeline figure (June to December 2006, “Testbed design” followed by “Benchmark instantiations”), now annotated with feedback from the studies:
– stability study: lack of architectural changes (exception handling added); bugs encountered were fixed; improvement of “alignments”; metrics redefined to account for CaesarJ mechanisms; more details added to the architecture documentation; architecture metrics refined; improved definition of the concern interaction metrics
– requirements study: common naming scheme; common activities]

Page 25: On the Design of a Testbed for AOSD

Evolution: feedback from the studies

• new categories of crosscutting concerns
– implementation level
• checked exceptions: EH aspectization is more challenging (illustrated in the AspectJ sketch below)
– use of the exception-softening mechanism
– complex, context-sensitive exception handlers
– use of around advice
– detailed design level:
• use of design patterns
– plenty of different uses of AO mechanisms (role-based composition, multiple inheritance, etc…)
• Particular aspect interactions still not investigated
– more than two aspects sharing the same join point
– no presence of pointcuts picking out advice executions
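
A minimal AspectJ sketch of the exception-handling points above: exception softening plus a context-sensitive handler realized as around advice. Type names follow the Health Watcher diagram, but the advised operations and the handling policy are assumptions for illustration.

    public aspect HWExceptionHandling {
        // Exception softening: checked SQLExceptions thrown in the data
        // layer are wrapped in the unchecked org.aspectj.lang.SoftException,
        // so business code no longer has to declare them.
        declare soft: java.sql.SQLException:
            execution(* ComplaintRep.*(..));

        // Context-sensitive handler as around advice: softened
        // persistence failures are mapped to an application-level error.
        Object around(): execution(* HealthWatcherFacade.*(..)) {
            try {
                return proceed();
            } catch (org.aspectj.lang.SoftException e) {
                throw new RuntimeException("persistence failure", e.getCause());
            }
        }
    }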

Page 26: On the Design of a Testbed for AOSD

EA and the Testbed

• Status:
– repository of AO and non-AO artifacts
– no changes have been applied
• Improvements are necessary, e.g.:
– there is no detailed problem description
• only use cases; requirements information is missing
– most of the requirements-level aspects are directly mapped to architecture and implementation aspects
– alignment of existing AO and non-AO artefacts needs to be improved
– some architecture models are abstract, and some architectural views are missing

Page 27: On the Design of a Testbed for AOSD

EA and the Testbed

• Elements of the testbed repository have proven useful even in unanticipated assessment contexts, e.g.
– AO measurement (U. Waterloo – Thiago Bartolomei)
– dynamic AO metrics (U. Milan – Walter Cazzola)
– AO design heuristics (U. Lancaster – Figueiredo, Sant’Anna, Garcia)
– architectural styles and aspects (U. Bologna, U. Lancaster, UFBA, UFRN)
• Used and extended in several ways
– investigate the interplay of AO requirements composition mechanisms and several attributes
• requirements description stability
• traceability
• change impact analysis
• understandability
• etc…

Page 28: On the Design of a Testbed for AOSD

EA and the Testbed

• Other lessons learned
– it is very difficult to design a proper testbed without the effective participation of the technique experts
• e.g. J. Araujo and A. Moreira (AORE technique)
• e.g. T. Bartolomei from the CaesarJ team
– the testbed is an effective collaboration/communication tool
• enables developers/researchers of emerging EA techniques to communicate
• a common set of artefacts
• improved problem understanding
• not targeted at one specific phase
– developers gain an improved awareness of all development phases
• enables focused discussions at EA workshops
– we need more funding $$$

Page 29: On the Design of a Testbed for AOSD

Future Expansions

• Other benchmarks
– … for assessing stability in early aspects techniques
– … for error proneness
• Expand testbed elements
– new applications
– apply more approaches
– develop new metrics
• Testbed repository is a semi-open resource for now
• The elements used and generated in the stability study are available at:
www.comp.lancs.ac.uk/~greenwop/ecoop07/

Page 31: On the Design of a Testbed for AOSD

Contributing to the Testbed

• Aim is to become an extensive open resource
• Only a limited number of approaches have initially been applied to the testbed
• Requires further contributions from the SE community
– applications
– new approaches
– metric suites

Page 32: On the Design of a Testbed for AOSD

Summary

• Provided an overview of the various elements that contribute to the testbed
• Illustrated how traceability can be achieved across development phases in terms of assessing approaches
• Gave a concrete example of how the testbed can be instantiated, which can also be done in other development phases
• Highlighted the benefits of using a common testbed for the community

Page 33: On the Design of a Testbed for AOSD

Other issues

• Important that the testbed is an open resource
• Necessary for users of the testbed to contribute the results they gather
• Repository of data
• Guidelines on how to select the benchmarks and indicators (and previous data)
• Validation of the benchmark (which issues should we consider?)
• Plethora of new composition mechanisms in AOSD
– how much should they affect the benchmarks’ design?
• e.g. CaesarJ has feature-oriented programming mechanisms that are best suited to product lines (PLs)

Page 34: On the Design of a Testbed for AOSD

Outline

• Provide an overview of the testbed
– aims
– elements
– design decisions
• Detail the targeted development phases
– approaches
– metrics
• Example instantiation of the testbed
– stability case study at the implementation phase
• Subset of results
– comparison of AORE approaches
– results of the stability case study
• Benefits and future work

Page 35: On the Design of a Testbed for AOSD

Testbed design: the first benchmark

• Answer key questions regarding the effectiveness of AOSD throughout the development life-cycle
• Provide a valuable resource to the software engineering community
• A common testbed used to assess and compare AO and non-AO approaches
• A communication vehicle for AO proponents

Page 36: On the Design of a Testbed for AOSD

• Possible focus of upcoming benchmarks
– design stability
– error proneness
– impact of aspects in adjacent phases
• e.g. requirements -> architecture (traceability, quality of decisions made, etc...)

Page 37: On the Design of a Testbed for AOSD

Achieving Traceability

• Phases are often assessed in isolation
• Desirable to determine the effects of one phase on subsequent phases
• A number of attributes are common across development phases
– concern interaction
– modularity
– stability
– change impact

Page 38: On the Design of a Testbed for AOSD

Requirements Phase

• A number of approaches applied
– Viewpoint-based AORE
– AO Requirements Analysis (AORA)
– MDSOC
– AOV-Graph
• Difficult to compare varied approaches
• The testbed project initiated related work for comparing AORE approaches2
– provides common schemes for comparison
• Some commonalities exist for comparison
– effort: time to produce documentation
– modularity

2A. Sampaio et al. A Comparative Study of Aspect-Oriented Requirements Engineering Approaches. Proc. of the 1st International Symposium on Empirical Software Engineering and Measurement (ESEM.07), September 2007. (to appear)

Page 39: On the Design of a Testbed for AOSD

Architecture Design Phase

• A variety of architecture approaches applied
– ACME, AspectualACME, AO-ADL, Aspectual Template, AOSD-Europe Notation
• A specific metric suite has been developed for assessing architecture design approaches
– coupling
– cohesion
– interface complexity
– SoC
– interactions
• Other general attributes to be measured
– effort
– stability
– change impact
• These metrics allow correlation with the requirements phase (a sketch of one possible coupling count follows)
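
As one concrete reading of the architecture-level coupling metric, here is a minimal Java sketch: the coupling of a component counted as the number of distinct other components whose interfaces it requires. This is an assumed simplification for illustration; the metric suite’s actual definitions are not given in the slides.

    import java.util.Map;
    import java.util.Set;

    class ArchCoupling {
        // requires: component name -> names of components whose interfaces
        // it depends on (hypothetical model of the architecture description).
        static int coupling(String component, Map<String, Set<String>> requires) {
            return (int) requires.getOrDefault(component, Set.of()).stream()
                    .filter(dep -> !dep.equals(component))
                    .count();
        }
    }

For example, with the Health Watcher layering, HWServlet in the GUI layer would require only the distribution layer’s IFacade, giving a coupling of 1.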

Page 40: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (Implementation Phase) (1)

• Aim was to compare/assess the stability of AO and non-AO approaches
• Involved selecting various elements provided by the testbed
– application, metric suites, etc.
• Apply new approaches to the base artefacts (Java/AspectJ implementation) to create new artefacts
– CaesarJ

Page 41: On the Design of a Testbed for AOSD

• Use the timeline to give examples
– how the studies fed back into the definition of the benchmarks
• change scenarios (different HW releases)
– can be reused for studies involving traceability, reuse, effectiveness of change impact analysis techniques, etc…
• indicators (concern interaction analysis)
• common naming scheme

Page 42: On the Design of a Testbed for AOSD

• Results gathered can influence future development of the testbed
• Metrics collected in the stability study highlighted deficiencies in some changes
– additional changes were added to improve coverage
• Development of new metrics
– modularity metrics were unable to capture all variations in the code due to their level of granularity
– change propagation metrics were developed and applied to be able to analyse all phenomena
• to explicitly investigate the differences between AspectJ and CaesarJ

Page 43: On the Design of a Testbed for AOSD

The Testbed as a Communication Tool

• Enables developers/researchers across phases to communicate
– a common set of artefacts
– improved problem understanding
– not targeted at one specific phase
• Developers gain an improved awareness of all development phases
• Enables focused discussions at workshops etc.

Page 44: On the Design of a Testbed for AOSD

• Need to establish commonalities between approaches in order for comparisons to be made
– tasks
• e.g. concerns, concern interaction, change propagation, modularity

Page 45: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (Design Stability Study)

[Figure combining both architecture and implementation measures.]

Page 46: On the Design of a Testbed for AOSD

Instantiation of the Benchmark (AO Requirements Study)

• Outcomes overview
– composition is the cornerstone of AORE
• composition specification is a time-consuming activity
– it improves change management and conflict analysis
– this trade-off requires further analysis
• conflict analysis is also a significant task
– composition specification and conflict analysis are the main bottlenecks
– future: comparison with non-AO RE approaches