Welcome to SENG 480A / CSC 485A / CSC 586A Self-Adaptive...
-
Welcome to SENG 480A / CSC 485A / CSC 586A
Self-Adaptive and Self-Managing Systems
Hausi A. Müller
Department of Computer Science
University of Victoria
http://courses.seng.uvic.ca/courses/2015/summer/seng/480a
http://courses.seng.uvic.ca/courses/2015/summer/csc/485a
http://courses.seng.uvic.ca/courses/2015/summer/csc/586a
1
-
Don’t forget …
● Take a control course before you graduate!
● Add some of the skills acquired in this course to your résumé
● Take advantage of this amazing point in time at the confluence of several technologies with the transformative opportunities of cyber-physical systems
All the very best for your career!
2
-
Announcements
● A4
  ○ Due Friday, July 31
  ○ Adaptive control
● Marks for Midterm 2 are posted
● Office hours for Midterm 2 on Tuesday, Aug 4, 1:30-2:30 pm in ECS 412
● Final marks will be posted by Aug 4
● Please complete the teaching evaluation today
  ○ Only 15% completion rate so far
  ○ Complete the CES at http://ces.uvic.ca
3
-
4
July 27 and July 30 CSC 586A Presentations
-
Graduate Student Research Paper Presentations
5
-
Utilizing Green Energy Prediction to Schedule Mixed Batch and Service Jobs in Data Centers
Baris Aksanli, Jagannathan Venkatesh, Liuyi Zhang, Tajana Rosing
Junnan Lu Francis Harrison
Aksanli, B., Venkatesh, J., Zhang, L., Rosing, T.: Utilizing Green Energy Prediction to Schedule Mixed Batch and Service Jobs in Data Centers. In: Proceedings 4th Workshop on Power-Aware Computing and Systems (HotPower 2011), Article 5 (2011)
http://sigops.org/sosp/sosp11/workshops/hotpower/05-aksanli.pdf
-
Outline
● Motivation
● Renewable Energy
● Related work
● Solutions
● Results and Contributions
● Conclusion
2
-
Motivation
● Data centers consume a lot of power
  ○ Millions of MWh, reflected as billions of dollars in electricity bills
  ○ Tons of carbon emissions to the atmosphere
3
-
Renewable Energy
The IDEA
4
-
The Problem
❏ Green energy output is very susceptible to environmental changes.
❏ Difficult to allocate for time-sensitive applications.
The renewable energy supply is highly variable, which decreases energy usage efficiency.
5
-
Related Work
❏ Energy prediction (solar and wind)
  ❏ Prediction becomes inaccurate with frequent weather changes.
❏ Using green energy in data centers
  ❏ The problem of variability in the renewable energy supply is unaddressed.
6
-
Energy Predictor
Solar Predictor:
● Exponentially Weighted Moving Average (EWMA)
  ○ Works fine under stable conditions.
  ○ Above 20% mean error under variable conditions.
● Weather-Conditioned Moving Average (WCMA)
  ○ 10% mean error under variable conditions.
Wind Predictor:
● A reference model based on wind direction (d), wind speed (v), and observation time (t).
● Mean error: 17.5% for a 30-minute prediction interval.
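To make the EWMA idea concrete, here is a minimal sketch (illustrative, not the paper's code; the function name and smoothing factor `alpha` are assumptions) of how such a predictor blends the previous estimate with the latest observed energy output:

```python
# Illustrative EWMA energy predictor: the forecast for each interval is a
# blend of the previous estimate and the latest observation. `alpha` is an
# assumed smoothing factor, not a value from the paper.
def ewma_predictions(observations, alpha=0.5):
    """Return the prediction made at the start of each interval."""
    estimate = observations[0]          # seed with the first observation
    predictions = []
    for observed in observations:
        predictions.append(estimate)    # forecast issued before observing
        estimate = alpha * estimate + (1 - alpha) * observed
    return predictions
```

Under stable conditions (a flat series) the estimate tracks the output exactly, which matches the slide's point that EWMA works well until the weather becomes variable.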
7
-
Predictive vs. Instantaneous Scheduler
Predictive Scheduler:
1. 30-minute prediction interval.
2. Provides scheduling info for the next round.
3. Makes decisions for the next round based on current parameters.
Instantaneous Scheduler:
1. 1-minute prediction interval.
2. No next-round scheduling provided.
3. No decision making for future scheduling.
8
-
GE Prediction and Data Center Modeling
1. Client requests queue.
2. Batch jobs queue.
3. Batch job slots.
4. Job scheduling based on the GE policy.
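A hypothetical sketch (not the paper's model; function name, units, and parameters are assumptions) of the core scheduling decision: each round, batch-job slots are granted only up to the predicted green-energy surplus left after serving client requests.

```python
# Hypothetical green-energy-aware slot allocation: service jobs are served
# first, and batch-job slots are granted from the predicted GE surplus.
def batch_slots_for_round(predicted_green_kw, service_load_kw, kw_per_slot):
    """Number of batch-job slots the predicted GE surplus can power."""
    surplus = max(0.0, predicted_green_kw - service_load_kw)
    return int(surplus // kw_per_slot)
```

With an accurate 30-minute prediction, slots can be committed for the whole round; the instantaneous policy would recompute this every minute with no lookahead.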
9
-
Results and Paper Contributions
❏ A novel, low-overhead wind predictor that utilizes data strongly correlated with wind speed, wind direction, and the amount of power generated.
❏ A mean error for a 30-minute period of around 17.2%, outperforming the time series method described in Kusiak, A., Zheng, H., Song, Z.: Short-term prediction of wind farm power: a data mining approach. IEEE Transactions on Energy Conversion 24(1), 125-136 (2009).
❏ Predictive and instantaneous green-energy-based schedulers.
10
-
How does this paper relate to this course?
11
With an 8% prediction error, green energy usage efficiency is 3x better than with the instantaneous policy, and green energy computation utilization is 7.7x better.
-
Conclusion
12
-
SmarterDeals
Research By:Ebrahimi, S., Villegas, N.M., Müller, H.A., Thomo, A.: SmarterDeals: a context-
aware deal recommendation system based on the SmarterContext engine.
CASCON 2012: 116-130 (2012)
Presented By: Carlene Lebeuf & Maria Ferman
1
http://dl.acm.org/citation.cfm?id=2399788
-
Problem/Motivation
2
● Daily-deals
  ○ an online advertising strategy
● Groupon
  ○ doesn’t filter properly
  ○ overwhelms users
● Need for improvement!
  ○ Increase revenue
  ○ Improve the user experience
  ○ Deliver coupons relevant to users
  ○ SmarterDeals!!!
-
Related Work
● Adomavicius and Tuzhilin (2005, 2008)
  ○ Multidimensional approach focusing on contextual information in addition to typical information
  ○ Contextual pre-filtering, contextual post-filtering, and contextual modeling
● Anand and Mobasher (2007)
  ○ Based on a cognitive science approach
  ○ User model based on...
    ■ short-term memory (current interactions)
    ■ long-term memory (previous rating interactions)
● SmarterContext (2012)
  ○ Discussed in further detail later...
3
-
4
Dynamic context management infrastructure that monitors the interactions of users with web entities in order to collect relevant information about the user’s situation and preferences.
Components:● The SmarterContext ontology ● The service-oriented software infrastructure● The user’s Personal Context Sphere
SmarterContext
-
SmarterContext Engine
● Context Representation and Reasoning
  ○ SmarterContext ontology
  ○ RDF graph of a user’s rating interaction
● Context Management
  ○ Monitoring the user’s web interactions
5
-
SmarterDeals (A Case Study)
● Improves recommendations by taking into account the user’s personal context information
6
-
Collaborative Filtering (User-Based Collaborative Filtering)
● Suggesting items that users with similar preferences have rated positively
● Calculating similarities between users with the following similarity measures:
  1. Traditional - Pearson Correlation & Weighted Sum
  2. Adapted Netflix - Adjusted Collaborative Filtering (ACF)
  3. Proposed Algorithm - Hierarchical ACF
● These approaches are tested with the Yelp dataset...
7
-
Traditional Collaborative Filtering (Baseline #1)
● Predict unknown ratings for users for products
  ○ based on similar users
● Find a set of similar users:
  ○ using the Pearson Correlation Coefficient (threshold = 0.7)
● Predict the unknown rating:
  ○ using a weighted-sum aggregation function
8
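The baseline above can be sketched as follows. This is a minimal illustration, not the SmarterDeals implementation: the function names are assumptions, and `ratings` is a plain `{user: {item: score}}` dictionary rather than the Yelp dataset.

```python
import math

# Illustrative user-based collaborative filtering: Pearson similarity over
# co-rated items, then a weighted-sum prediction over similar users.
def pearson(ra, rb):
    """Pearson correlation over the items both users rated."""
    common = set(ra) & set(rb)
    if len(common) < 2:
        return 0.0
    ma = sum(ra[i] for i in common) / len(common)
    mb = sum(rb[i] for i in common) / len(common)
    num = sum((ra[i] - ma) * (rb[i] - mb) for i in common)
    den = (math.sqrt(sum((ra[i] - ma) ** 2 for i in common))
           * math.sqrt(sum((rb[i] - mb) ** 2 for i in common)))
    return num / den if den else 0.0

def predict_rating(ratings, user, item, threshold=0.7):
    """Weighted sum of ratings from users whose similarity exceeds threshold."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        sim = pearson(ratings[user], r)
        if sim > threshold:
            num += sim * r[item]
            den += abs(sim)
    return num / den if den else None   # None when no similar user rated item
```

The 0.7 threshold mirrors the slide: only strongly correlated users contribute to the weighted sum.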
-
Adjusted Collaborative Filtering (ACF) (Baseline #2)
● Accounts for differences in users’ rating tendencies
  ○ removes user-item interactions
● Calculate the baseline predictor for similar users
  ○ based on the average rating (μ), item deviation (bi), and user deviation (bu)
● Predict the unknown rating:
  ○ using Adjusted Collaborative Filtering with user similarity > 0.7
9
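The baseline predictor on this slide, b_ui = μ + b_i + b_u, can be sketched as below. This is a simplified illustration (it computes the deviations as plain averages and omits ACF's similarity-weighted correction term; the function name is an assumption):

```python
# Simplified Netflix-style baseline predictor b_ui = mu + b_i + b_u:
# the global mean plus the item's and the user's deviation from it.
def baseline_predictor(ratings, user, item):
    """Estimate b_ui from a {user: {item: score}} dict."""
    all_scores = [s for r in ratings.values() for s in r.values()]
    mu = sum(all_scores) / len(all_scores)            # global average rating
    item_scores = [r[item] for r in ratings.values() if item in r]
    b_i = sum(item_scores) / len(item_scores) - mu if item_scores else 0.0
    user_scores = list(ratings.get(user, {}).values())
    b_u = sum(user_scores) / len(user_scores) - mu if user_scores else 0.0
    return mu + b_i + b_u
```

Subtracting this baseline before comparing users is what "removes user-item interactions": a generous rater and a harsh rater become comparable.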
-
Hierarchical Adjusted Collaborative Filtering (Authors’ Proposed Algorithm)
● Predictions based on Groupon’s parent categories
  ○ Accounts for the sensitivity of categories
● User similarity:
  ○ Pearson Correlation Coefficient
● User baseline predictor:
  ○ Calculated per category (Pk)
● Predict the unknown rating:
  ○ Adjusted Collaborative Filtering with user similarity > 0.7
10
-
Validation
1. For each approach:
   a. Create a new Yelp dataset
      ○ remove an individual rating (rui)
   b. Predict the rating (řui)
      ○ determine the error (eui = rui - řui)
   c. Repeat by...
      ○ calculating eui for all ratings in the dataset
2. Calculate the Root Mean Square Error (RMSE)
   ○ For the proposed algorithm, RMSE is calculated for each category
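The leave-one-out procedure above can be sketched as follows (a minimal illustration, not the authors' harness; `predict_fn` stands in for whichever of the three predictors is being validated):

```python
import math

# Leave-one-out RMSE: hide each known rating, predict it from the rest,
# accumulate e_ui = r_ui - ř_ui, and report the root mean square error.
def leave_one_out_rmse(ratings, predict_fn):
    squared_errors = []
    for user, items in ratings.items():
        for item, true_rating in items.items():
            # Copy of the dataset with this single rating removed.
            held_out = {u: {i: s for i, s in r.items()
                            if not (u == user and i == item)}
                        for u, r in ratings.items()}
            predicted = predict_fn(held_out, user, item)
            if predicted is not None:
                squared_errors.append((true_rating - predicted) ** 2)
    if not squared_errors:
        return None
    return math.sqrt(sum(squared_errors) / len(squared_errors))
```

For the hierarchical variant, the same loop would simply be run once per Groupon parent category.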
11
-
Validation (Summary of Results)
Overall, the approach was...
● 8.1% more accurate than Classic Collaborative Filtering
● 2.0% more accurate than Adjusted Collaborative Filtering
... but the approach performed poorly on...
● Professional Services
● Public Services & Government
12
-
SmarterDeals & Course Material
● The Age of Context!
● Situational Awareness
  ○ Perception of the environment & personal context in terms of time and space
  ○ Comprehension of its meaning
● Software Engineering at Runtime
  ○ Requirements @ Runtime
  ○ Analysis @ Runtime
  ○ …
13
-
14
Thank you!
Questions?
Ebrahimi, S., Villegas, N.M., Müller, H.A., Thomo, A.: SmarterDeals: a
context-aware deal recommendation system based on the
SmarterContext engine. CASCON 2012: 116-130 (2012)
http://dl.acm.org/citation.cfm?id=2399788
-
Summary of Contributions
● Improved recommendations based on continually changing contextual information
  ○ The ability to dynamically adapt to...
    ■ changes in location
    ■ user behaviour / preferences
● Validation of existing collaborative filtering algorithms
  ○ Traditional Collaborative Filtering
  ○ Adjusted Collaborative Filtering (the Netflix algorithm)
15
-
A Framework for Evaluating Quality-Driven Self-Adaptive Software Systems
Parminder Kaur
Navpreet Kaur
CSC 586A
Villegas, Müller (University of Victoria); Tamura, Duchien (University of Lille); Casallas (University of Los Andes)
Villegas, N.M., Müller, H.A., Tamura, G., Duchien, L., Casallas, R.: A framework for evaluating quality-driven self-adaptive software systems. In: Proc. 6th Int. Symposium on Software Engineering for Adaptive and Self-Managing Systems (SEAMS 2011), pp. 80-89 (2011)
16
-
Model: Eight Analysis Dimensions
1. Adaptation goal
2. Reference input
   • SLA / SLO
   • Constraints
   • Regular expressions
   • Single values (physical/logical)
17
-
3. Measured Outputs
• Specified
  – Continuous domains for single variables/signals
  – Logical expressions/conditions for contract states
  – Conditions expressing states of system malfunction
• Monitored
  – Measurements of physical properties from physical devices (e.g., CPU temperature)
  – Measurements of logical properties of computational elements (e.g., CPU load in hardware)
  – Measurements of external context conditions (e.g., weather conditions)
4. Computation Control Actions (CCA)
• Discrete operations that affect:
  – the process of the managed system (resume, sleep, halt)
  – the infrastructure executing the managed resource (the host system’s buffer allocation)
  – the managed system’s software architecture (reconfiguration operations)
  – behavioural properties of the managed system
18
-
5. System Structure
Identifies options for the controller structure.
• Adaptation Controller
  – Variations of the MAPE-K loop with models of the managed system
  – Feedback control, adaptive control, reconfigurable control, modifiable controller
• Managed System
  – Non-modifiable structure (e.g., a monolithic system)
  – Modifiable structure (e.g., a reconfigurable software architecture)
6. Observable Adaptation Properties (Quality/Characteristic)
• Adaptation Controller
  – Accuracy, stability, settling time, small overshoot, robustness, termination, consistency, security
• Managed System
  – Behavioural/functional invariants, QoS
  – Dependability (availability, reliability)
  – Security (confidentiality, integrity)
19
-
7. Proposed Evaluation
• Identifies the strategy each approach proposes to evaluate itself.
8. Identified Metrics
• Corresponds to the metrics used to measure the adaptation’s variables of interest in the analysed approach.
20
-
Analysis of self-adaptive approaches
21
-
22
-
Evaluation of Self-Adaptation
• Identify adaptation goals
• Identify adaptation properties
• Map the quality attributes used to evaluate the managed system to properties that evaluate the controller but are observable on the managed system
• Define metrics to evaluate the properties observable on the managed system and the controller
23
-
Quality Attributes as Adaptation Goals
• Performance - timeliness of service; the time required for the system to respond to events.
• Dependability - the level of reliance that can justifiably be placed on the service.
• Security - confidentiality, integrity, availability.
• Safety - concerned with the occurrence of accidents, defined in terms of external consequences.
24
-
Adaptation Properties
• Stability
  The degree to which the adaptation process converges toward the control objective.
• Accuracy
  How closely the managed system approximates the desired state.
• Short settling time
  The time required for the adaptive system to achieve the desired state.
• Small overshoot
  Managing resource overshoot is important to avoid system instability.
• Consistency
  Aims at ensuring the structural and behavioural integrity of the managed system after performing an adaptation process.
• Robustness
  The managed system must remain stable and guarantee accuracy even if the managed state differs from the expected state in some measured way.
• Termination (of the adaptation process)
  The planner in the MAPE-K loop produces discrete control actions to adapt the managed system, such as a list of component-based architecture operations. Termination guarantees that this list is finite and that its execution will finish, even if the system does not reach the desired state.
• Security
  The target system, and the data and components shared with the controller, must be protected from disclosure and destruction.
• Scalability
  The capability of the controller to support increasing demands of work with sustained performance.
25
-
Mapping Adaptation properties to quality attributes
Classification of Adaptation properties
26
-
Adaptation Metrics
• Adaptation properties provide a way to evaluate adaptive systems.
• For most properties, it is impossible to evaluate them by observing the controller itself; instead, they are evaluated by observing quality attributes on the managed system.
• To identify relevant metrics, the paper characterizes the factors that affect the evaluation of quality attributes, such as memory usage, throughput, response time, processing time, mean time to failure, and mean time to repair.
27
-
Conclusion
• The framework is based on a survey of papers on self-adaptive systems and a set of adaptation properties derived from control theory.
• It maps these properties to software quality attributes.
• Future work will focus on validating the proposed adaptation properties and their quality attributes by evaluating existing adaptive systems.
28
-
Thank you!
29
-
7/30/2015 30
DYNAMICO: A Reference Model in Self-Adaptive
Systems
CSC 586A Self-Adaptive and Self-Managing Systems
Babak Tootoonchi / Arturo Reyes Lopez
Norha M. Villegas (1,4), Gabriel Tamura (2,3,4), Hausi A. Müller (1), Laurence Duchien (2), Rubby Casallas (3)
1 University of Victoria, Victoria, Canada
2 University of Lille, Lille, France
3 University of Los Andes, Bogotá, Colombia
4 Icesi University, Cali, Colombia
-
Introduction
• How to evolve monitoring infrastructure?
Feedback Control System Autonomic Manager
-
DYNAMICO Definition
• Reference model
• Explicit concerns:
  – Management of Control Objectives
  – Independent Context Management
  – Adaptation to New Goals
32
Decouple Feedback Loop
-
33
Levels of Dynamics
Control Objective
Adaptation
Monitoring
-
The Control Objective Feedback Loop (CO-FL)
• Control objectives: the system’s behavior
• Changes in goals
• Implements a MAPE loop
• Input: monitor observations & user requests
34
[Figure: CO-FL block diagram — Objective Monitor, Objective Analyzer, Objectives Controller, Planner, Executor, and Context Monitor; signals include control objectives symptoms, differences, and outputs, plus the control error (to the Context Analyzer); additional inputs come from the Adaptation Analyzer and the User.]
-
The Adaptation Feedback Loop (A-FL)
• Aware of objectives
• Considers the monitored context
• Implements a MAPE loop
• Controls the target system at runtime
35
[Figure: A-FL block diagram — Adaptation Monitor, Adaptation Analyzer, System Adaptation Controller, Planner, Executor, and Preprocessing around the Target System; signals include control symptoms, control error, control input, measured control output, and preprocessed system output.]
-
The Monitoring Feedback Loop (M-FL)
• Manages context information
• Controls the target system’s sensors
• Implements a MAPE loop
• Changes at runtime
36
[Figure: M-FL block diagram — Context Monitor, Context Analyzer, Context Adaptation Controller, Planner, Executor, and Preprocessing around the Context Manager; signals include control symptoms, control error, control input, measured control output, and preprocessed system output.]
-
37
Reference Model
-
Conclusions
• Control Objectives Feedback Loop: changes in goals & requirements
• Adaptation Feedback Loop: changes at the target system level
• Dynamic Monitoring Feedback Loop: changes in the monitoring infrastructure
38
-
39
Course Reflections
• Negative Feedback Loop (Control Theory)
• Context Awareness
• MAPE-K Loop
  – Knowledge: symptoms and historical information
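The two course concepts above can be combined in a toy sketch (not from the slides; class and method names are illustrative): a MAPE-K cycle that closes a negative feedback loop around a setpoint and records symptoms in its knowledge base.

```python
# Toy MAPE-K cycle: Monitor a measurement, Analyze the deviation from the
# setpoint, Plan a negative-feedback correction, Execute it via an actuator,
# and record everything in the shared Knowledge base.
class MapeKLoop:
    def __init__(self, setpoint):
        self.setpoint = setpoint
        self.knowledge = []                 # K: symptoms + historical data

    def run_cycle(self, measured, actuator):
        symptom = measured - self.setpoint  # Monitor/Analyze: deviation
        correction = -symptom               # Plan: negative feedback action
        actuator(correction)                # Execute: apply via an effector
        self.knowledge.append((measured, correction))
        return correction
```

A measurement above the setpoint yields a correction in the opposite direction, which is exactly the negative-feedback behavior from control theory.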
-
Thank you
-
I WISH YOU ALL THE BEST FOR YOUR SCIENCE AND ENGINEERING CAREERS
Enjoy your time at UVic and in Victoria!
41