Web Service Discovery and Selection: Pragmatic Approaches
Natallia Kokash, PhD student, XX cycle
Tutor: Vincenzo D’Andrea. Adviser: Marco Aiello
20/10/2006 University of Lecce 2
DSSOC people
□ Marco Aiello <[email protected]>
□ Fabio Casati <[email protected]>
□ Vincenzo D’Andrea <[email protected]>
□ Maurizio Marchese <[email protected]>
□ Ganna Frankova <[email protected]> – 3rd year
□ Natallia Kokash [email protected] – 3rd year
□ GR <[email protected]> – 3rd year
□ Alexander Ivanyukovich <[email protected]> – 4th year
□ Alexander Lazovik <[email protected]> – 5th year
Scope
□ Web Service Composition
□ Distributed Systems
□ Web Service Discovery
□ Quality of Service (Security)
□ Intellectual Property and Licensing
Introduction
□ Discovery of web services
  □ Structural interface matching
  □ Hybrid methods
  □ Behavioral interface matching
□ QoS issues
  □ Web service selection algorithms
  □ Risk evaluation
□ Recommendation systems as a tool for web service selection
Web Services
□ A Web service is:
  □ a software system
  □ identified by a URI,
  □ whose public interfaces and bindings are defined and described using XML.
□ Its definition can be discovered by other software systems.
□ These systems may then interact with the Web service in a manner prescribed by its definition, using XML-based messages conveyed by Internet protocols.
[Web Services Architecture, W3C Working Draft 14 November 2002, from http://www.w3.org/TR/ws-arch/ on 5th March 2002]
SOA and Web Services
Diagram: the SOA triangle. A Service Provider publishes a Service Description to a Service Registry (UDDI, ebXML); a Service Requestor finds the description in the registry and binds to the provider’s service within a service-oriented application. The Service Description comprises:
□ Service Interface Descriptions
□ Service Behavior Descriptions
□ Service Quality
Web Services Discovery
□ Matching – meeting the functionality required by a user with specifications of existing services
  □ Generic (heuristics, domain-independent ontologies)
  □ Personal (preferences, specific functions and patterns for comparing requests and existing services)
  □ Community (domain-specific ontologies)
□ Selection – choosing the service with the best quality among those able to satisfy the user’s goal
Thesis objectives
□ It is difficult for a user to write a correct request.
□ Automated semantic matching is not feasible.
□ Users are in different conditions.
□ QoS characteristics are constantly changing.
□ Collective user experience can be used to improve service selection.
□ Domain-specific quality factors should be involved in service selection.
Service Description
□ Web Service Description Language (WSDL)
  □ Identity – the unique identity of the interface
  □ Input/Output – the meaning of input and output parameters
  □ Faults – specify the abstract message format for any error messages that may be output as the result of an operation
  □ Types – declare the data types used in the interface (XML Schema)
  □ Documentation – natural-language service description and usage guide
Semantic Web Services (1)
□ Managing End-To-End OpeRations (METEOR-S)
□ Semantic Web Services Framework (SWSF)
□ Web Service Modelling Ontology (WSMO)
□ Ontology Web Language for Services (OWL-S)
  □ Preconditions – a set of semantic statements that are required to be true before an operation can be successfully invoked
  □ Effects – a set of semantic statements that must be true after an operation completes execution
  □ Restrictions – a set of assumptions about the environment that must be true
  □ Quality of Service – non-functional parameters such as response time, execution cost, capacity, etc.
OWL-S
Diagram: an OWL-S Service (a resource) presents a Service Profile (what does the service do?), is describedBy a Service Model (how does it work?), and supports a Service Grounding (how is it accessed?).
Limitations:
□ Does not provide context identification
□ Does not describe objects used by the service but not provided by the client
□ Does not describe what the service does
Semantic Web Services (2)
□ WSDL-S
  □ The semantics of a service’s operations are added directly to WSDL files
  □ Easy to deploy and use
  □ Does not support the full features of the OWL-S process ontology
Motivating example

GoogleSearch:
<message name="doGoogleSearchResponse">
  <part name="return" type="GoogleSearchResult"/>
</message>
...
<complexType name="GoogleSearchResult">
  <all>
    <element name="searchComments" type="string"/>
    <element name="estimatedTotalResultsCount" type="int"/>
    <element name="resultElements" type="ResultElementArray"/>
    ...
  </all>
</complexType>

WolframSearch:
<message name="WolframSearchResponse">
  <part element="WolframSearchReturn"/>
</message>
...
<element name="WolframSearchReturn">
  <complexType>
    <sequence>
      <element name="Result" type="WolframSearchResult"/>
    </sequence>
  </complexType>
</element>
...
<complexType name="WolframSearchResult">
  <sequence>
    <element name="TotalMatches" type="int"/>
    <element name="Comment" type="string"/>
    <element name="Matches" type="WolframSearchMatchArray"/>
    ...
  </sequence>
</complexType>
WS Matching Algorithm
□ Requirements:
  □ Matched advertisements are returned in sorted order, according to their degree of match
  □ For each element a matching confidence is known (easy to see where problems may occur)
  □ The algorithm tries to capture semantics
Web Service Interface Matching
Diagram: matching at two levels. At the service level, two web services, each described by a name, a description, and a set of operations (X, Y vs. A, B, Z) – what is their similarity? At the operation level, two operations, each described by a name, a description, inputs, and outputs (x, y vs. a, b, c) – what is their similarity?
WS Matching Algorithm
□ Tokenization of identifiers into:
  □ sequences of more than one uppercase letter
  □ sequences of an uppercase letter and the following lowercase letters
  □ sequences between two non-word symbols
  Example: ”tns:GetDNSInfoByWebAddressResponse” → {tns, get, dns, info, by, web, address, response}
□ Word stemming
□ Stopword removal
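The three tokenization rules above can be sketched with a single regular expression; this is an illustrative reconstruction, not the code behind the reported tool:

```python
import re

def tokenize(identifier):
    """Split a WSDL identifier into lowercase tokens:
    acronyms (2+ uppercase letters), capitalized words,
    and the runs between non-word symbols."""
    tokens = []
    for part in re.split(r"\W+", identifier):  # split on non-word symbols
        tokens += re.findall(
            r"[A-Z]{2,}(?=[A-Z][a-z]|\b)"  # acronym, e.g. DNS
            r"|[A-Z][a-z]+"                # capitalized word, e.g. Info
            r"|[A-Z]+|[a-z]+|\d+",         # remaining letter/digit runs
            part)
    return [t.lower() for t in tokens]

tokenize("tns:GetDNSInfoByWebAddressResponse")
# -> ['tns', 'get', 'dns', 'info', 'by', 'web', 'address', 'response']
```

The lookahead keeps the final uppercase letter of an acronym attached to the following word (DNSInfo → dns, info).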
Diagram: the matching pipeline. The registry is parsed, tagged, and indexed; a query is then matched against the index using VSM, semantic, or other metrics (supported by ontologies and meta-data), and hybrid combinations are evaluated.
□ http://dit.unitn.it/~kokash/sources
Structural Matching
Diagram: each element of a requested operation is linked to an element of a provided operation with weight wij; the assignment is computed as a maximum-weight bipartite matching using Kuhn’s Hungarian method (polynomial time). An overall similarity score is then defined; the query type can be similarity or inclusion.
The compared WSDL structure: Service (name, operations, description); Operation (name, input message, output message, description); Message (name, parts, description); Part (name, type, description); Type (name, elements, description); Element.
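The assignment step can be illustrated on a tiny weight matrix; the brute force below is for exposition only (Kuhn’s Hungarian method computes the same optimum in polynomial time), and the weights are made up:

```python
import itertools

def max_weight_matching(w):
    """Exhaustively match each row (requested element) to a distinct
    column (provided element) so that the total weight is maximal."""
    n_rows, n_cols = len(w), len(w[0])
    best_total, best_pairs = -1.0, []
    for cols in itertools.permutations(range(n_cols), n_rows):
        total = sum(w[i][c] for i, c in enumerate(cols))
        if total > best_total:
            best_total, best_pairs = total, list(enumerate(cols))
    return best_total, best_pairs

# wij: similarity of requested element i to provided element j
w = [[0.9, 0.1, 0.3],
     [0.2, 0.8, 0.4]]
total, pairs = max_weight_matching(w)  # pairs: [(0, 0), (1, 1)]
similarity = total / len(w)            # one way to define the overall score
```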
Lexical Matching
Metrics:
□ Vector-Space Model (VSM) with tf-idf term weighting
□ Semantic matching (VSM + WordNet):
  • semantic matching of word pairs
  • semantic matching of sentences
Seco, N., Veale, T., Hayes, J.: “An intrinsic information content metric for semantic similarity in WordNet”, ECAI, 2004, pp. 1089-1090
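The VSM metric can be sketched as tf-idf weighting plus cosine similarity; a minimal version over whitespace-tokenized service descriptions (the corpus below is invented):

```python
import math
from collections import Counter

def tfidf_cosine(doc_a, doc_b, corpus):
    """Cosine similarity of two documents under tf-idf weights
    computed from a (small) reference corpus."""
    n = len(corpus)
    df = Counter()                      # document frequency of each term
    for doc in corpus:
        df.update(set(doc.split()))
    def vec(doc):
        tf = Counter(doc.split())
        return {t: tf[t] * math.log(n / df[t]) for t in tf}
    va, vb = vec(doc_a), vec(doc_b)
    dot = sum(va[t] * vb.get(t, 0.0) for t in va)
    na = math.sqrt(sum(x * x for x in va.values()))
    nb = math.sqrt(sum(x * x for x in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = ["convert currency rate", "get weather report", "currency exchange service"]
sim = tfidf_cosine(corpus[0], corpus[2], corpus)  # share only "currency"
```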
Experimental Results (Test 1)
(Chart: VSM vs. the semantic metric on a collection of 40 web services divided into 5 groups; values shown: 10%, 15%, 40%, 45%.)
Related Work
1. [Sajjanhar’04] Sajjanhar, A., Hou, J., Zhang, Y.: ”Algorithm for Web Services Matching”, APWeb, 2004, pp. 665–670.
2. [Bruno’05] Bruno, M., Canfora, G. et al.: ”An Approach to support Web Service Classification and Annotation”, IEEE International Conference on e-Technology, e-Commerce and e-Service, 2005.
3. [Corella’06] Corella, M.A., Castells, P.: “Semi-automatic semantic-based web service classification”, International Conference on Knowledge-Based Intelligent Information and Engineering Systems, 2006.
4. [Dong’04] Dong, X.L. et al.: ”Similarity Search for Web Services”, VLDB, 2004.
5. [Platzer’05] Platzer, C., Dustdar, S.: “A vector space search engine for Web services”, Proceedings of the IEEE European Conference on Web Services (ECOWS), 2005.
6. [Stroulia’05] Stroulia, E., Wang, Y.: ”Structural and Semantic Matching for Assessing Web Service Similarity”, International Journal of Cooperative Information Systems, Vol. 14, No. 4, 2005, pp. 407-437.
7. [Wu’05] Wu, J., Wu, Z.: ”Similarity-based Web Service Matchmaking”, IEEE International Conference on Services Computing, 2005, pp. 287-294.
8. [Zhuang’05] Zhuang, Z., Mitra, P., Jaiswal, A.: ”Corpus-based Web Services Matchmaking”, AAAI, 2005.
9. [Verma’05] Verma, K., Sivashanmugam, K., et al.: “METEOR-S WSDI: A scalable P2P infrastructure of registries for semantic publication and discovery of web services”, Journal of Information Technology and Management, Special Issue on Universal Global Integration, Vol. 6, No. 1, 2005, pp. 17-39.
Hybrid algorithms
Diagram: hybridization taxonomy. Hybridization of algorithms or of data; strategies: augmentation, mixed, cascade, switching, combination.
Future work
□ Hybrid algorithms:
  □ Rocha, C. et al.: “A Hybrid Approach for Searching in the Semantic Web”, International World Wide Web Conference, 2004, pp. 374-383.
  □ Castells, P., Fernandez, M., Vallet, D.: “An Adaptation of the Vector-Space Model for Ontology-Based Information Retrieval”, IEEE Transactions on Knowledge and Data Engineering, 2007, to appear.
□ Empirical evaluation of different algorithms using a similar collection of web services
Related work
□ Syeda-Mahmood, T., Shah, G., et al.: “Searching service repositories by combining semantic and ontological matching”, International Conference on Web Services, 2005, pp. 13-20.
“(1) The domain-independent relationships are derived using an English thesaurus… (2) The domain-specific ontological similarity is derived by inferencing the semantic annotations associated with web service descriptions.
…better relevancy results can be obtained for service matches from a large repository, than could be obtained using any one cue alone.”
□ Klusch, M., Fries, B., Sycara, K.: “Automated Semantic Web Service Discovery with OWLS-MX”, AAMAS, 2006.
“…under certain constraints logic based only approaches to OWLS service I/O matching can be significantly outperformed by hybrid ones.”
Composition Patterns
□ Sequence
□ Loop
□ AND split followed by AND join
□ AND split followed by an m-out-of-n join
□ XOR split followed by XOR join
□ OR split followed by OR join
□ OR split followed by an m-out-of-n join
Notation
□ Sequential operator: s1 ; s2
□ Parallel operator: s1 | s2 (n-ary: |n)
□ Choice operator: s1 + s2 (n-ary: +n)
□ Web service: si
□ Start state: t0
□ End state: t
□ Quality parameters: q(si)
Behavioral Interface matching
• How can a (composite) service be obtained if there is no direct match for a request in the current service registry?
□ Behavioral interface – an interface that captures ordering constraints between interactions
□ BPEL4WS – Business Process Execution Language for Web Services
Interface transformation
Diagram: operators mapping a source interface onto a target interface: Flow (A, B), Gather (A, B → C), Scatter (C → A, B), Burst (A → Bi), Collapse (Ai → B), Hide (A).
References
□ Robert J. Hall and Andrea Zisman, “Behavioral Models as Service Descriptions”, ICSOC, 2004.
□ Dumas, M., Spork, M., Wang, K. : “Algebra and Visual Notation for Service Interface Adaptation”, 4th International Conference on Business Process Management (BPM), 2006.
□ Benatallah, B., Hacid, M-S., Leger, A., Rey, K., Toumani, F.: ”On automating Web services discovery”, VLDB Journal, N 14, 2005, pp. 84–96.
□ Lang, Q.A., Su, St. Y.W : "AND/OR Graph and Search Algorithm for Discovering Composite Web Services", International Journal of Web Services Research, 2(4), 46-64, 2005.
Web Services Discovery
□ Matching – meeting the functionality required by a user with specifications of existing services
  □ Generic (heuristics, domain-independent ontologies)
  □ Personal (preferences, specific comparison functions)
  □ Community (domain-specific ontologies)
□ Selection – choosing the service with the best quality among those able to satisfy the user’s goal
QoS Issues
□ How to define QoS?
  □ complexity of run-time QoS information
  □ dependencies among different QoS parameters
□ How to specify user preferences?
□ How to match user requirements with existing services in terms of QoS?
□ How to rank similar services w.r.t. user preferences?
□ How to predict QoS factors under certain environmental conditions?
  □ dependencies among different QoS parameters
  □ relations with contextual factors
□ The same questions arise for composite web services
QoS characteristics
□ Multidimensionality:
  □ different QoS-driven web service selection algorithms
□ Subjectivity:
  □ dependence on context, consumer, etc.
  □ QoS run-time monitoring and analysis on the user side is required
QoS parameters
□ Throughput – the number of requests served in a given time period
□ Capacity – a limit on concurrent requests for guaranteed performance
□ Latency – the round-trip time between client request and service response
□ Response time (duration) – the time taken by a service to process its sequence of activities
□ Availability – the probability that a service is available
□ Reliability – stability of the service functionality, i.e., the ability of a service to perform its functions under stated conditions
□ Reputation – the average rating of the service reported by clients
□ Execution cost (price) – the amount of money for a single service execution
Linear programming approach [Zeng et al. 2004]
QoS values are scaled and weighted, and candidate services are ranked by a linear combination of:
□ price
□ duration
□ reputation
□ success rate
□ availability
where the weights Wj are user preferences.
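The scaling and weighting steps can be sketched as simple additive weighting; the min–max normalization and the service values below are illustrative assumptions, not the exact model of Zeng et al.:

```python
def normalize(values, benefit=True):
    """Min-max scale one QoS attribute to [0, 1]; benefit=False flips
    attributes where lower is better (price, duration)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if benefit:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def rank(services, weights):
    """services: name -> (price, duration, reputation); weights sum to 1."""
    names = list(services)
    cols = list(zip(*services.values()))
    scaled = [normalize(cols[0], benefit=False),  # price: lower is better
              normalize(cols[1], benefit=False),  # duration: lower is better
              normalize(cols[2], benefit=True)]   # reputation: higher is better
    scores = {n: sum(w * scaled[j][i] for j, w in enumerate(weights))
              for i, n in enumerate(names)}
    return max(scores, key=scores.get), scores

# Hypothetical candidates and preferences: price matters most
services = {"s1": (10.0, 2.0, 4.5), "s2": (5.0, 4.0, 3.0)}
best, scores = rank(services, [0.6, 0.2, 0.2])  # best: "s2"
```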
Some questions
1. Scaling:
   • Availability: 0 → 0, 100% → 1
   • Response time: 0 → 1, timeout → 0
2. Objective function: is a linear combination adequate? Can we rely on the preferences defined by a user?
3. Which service is better: cheap but not reliable, or reliable but expensive?
4. A service failed but the task should still be accomplished: the structure of a redundant composition graph.
New WS selection algorithm
□ Notation:
  □ c – composition
  □ q(si) – quality parameter (response time, execution cost)
  □ p(si) – probability of success
  □ qmax – resource limit
□ Time vs. cost: the basic approach is to take the less important parameter as the objective function, provided that the most important criterion meets some requirements.
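The formula did not survive transcription; it can be reconstructed, under standard assumptions about how per-service values aggregate, as a constrained optimization over compositions:

```latex
% optimize the less important parameter q_2 subject to a limit on
% the most important one q_1 (reconstruction, not the original slide)
\min_{c} \; q_2(c) \quad \text{s.t.} \quad q_1(c) \le q_{\max},
% where, e.g., for a sequential composition c = s_1 ; \dots ; s_n:
\qquad q(c) = \sum_{i=1}^{n} q(s_i), \qquad p(c) = \prod_{i=1}^{n} p(s_i).
```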
Example
□ Goal: translate a document from Belarusian to Turkish
□ Available web services:
  □ Belarusian – English (b-e)
  □ Belarusian – German (b-g)
  □ German – Turkish (g-t)
  □ English – Turkish (e-t)
  □ German – English (g-e)
□ WS compositions that can satisfy the user’s goal:
  □ Belarusian – English – Turkish
  □ Belarusian – German – Turkish
  □ Belarusian – German – English – Turkish
Risk management
□ Requires assessment of inherently uncertain events and circumstances
□ Two dimensions:
  □ how likely the uncertainty is to occur (probability)
  □ what the effect would be if it happened (impact)
□ Example:
  □ Movie: title=Rainmaker, format=DVD, languages=Italian, English
  □ Convert DVD to AVI: language=English
  □ SimpleDivX converter: time=2 hours, language=Italian
  □ Impact on time: 2 hours are lost
Failure risk
Failure risk – combines the probability that some fault will occur with the resulting impact of this fault on the composite service.
Loss function – defines the cost of service failure (money, time, resources).
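A standard way to write the failure risk, assuming p(s_i) is the success probability introduced earlier and L the loss function (the exact formula did not survive transcription):

```latex
% risk = probability of failure x impact, matching the two
% risk dimensions (probability, impact) on the previous slide
R(s_i) \;=\; \bigl(1 - p(s_i)\bigr) \cdot L(s_i)
```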
Scenario
Diagram: an end-user invokes the provider’s composite service s0, which orchestrates the partner services s1–s5 combined with choice (+) operators. Risks:
□ Service failures
□ Service changes
□ Violations of Service Level Agreements (SLAs)
□ Absence of alternative solutions (penalties)
Failure risk: example (p = 0.5, cost = 1, penalty = 2)
Diagram: five alternative compositions of the services s1–s4, built with choice (+) operators, yield failure risks of 1.25, 1.375, 1.5625, 1.625, and 1.75.
Related Work
1. [Zeng 2004] Zeng, L., Benatallah, B., et al.: ”QoS-aware Middleware for Web Services Composition”, IEEE Transactions on Software Engineering, Vol. 30, No. 5, 2004, pp. 311–327.
2. [Ardagna 2005] Ardagna, D., Pernici, B.: ”Global and Local QoS Constraints Guarantee in Web Service Selection”, IEEE International Conference on Web Services, 2005, pp. 805–806.
3. [Yu 2005] Yu, T., Lin, K.J.: ”Service Selection Algorithms for Composing Complex Services with Multiple QoS Constraints”, International Conference on Service-Oriented Computing, 2005, pp. 130–143.
4. [Claro 2005] Claro, D., Albers, P., Hao, J-K.: “Selecting Web Services for Optimal Composition”, Proceedings of the ICWS 2005 Second International Workshop on Semantic and Dynamic Web Processes, 2005, pp. 32-45.
5. [Canfora 2005] Canfora, G., di Penta, M., Esposito, R., Villani, M.-L.: “QoS-Aware Replanning of Composite Web Services”, Proceedings of the International Conference on Web Services, 2005.
6. [Martin-Diaz 2005] Martin-Diaz, O., Ruiz-Cortes, A., Duran, A., Muller, C.: ”An Approach to Temporal-Aware Procurement of Web Services”, International Conference on Service-Oriented Computing, 2005, pp. 170–184.
7. [Bonatti 2005] Bonatti, P.A., Festa, P.: “On Optimal Service Selection”, Proceedings of the 14th International Conference on World Wide Web, 2005, pp. 530-538.
8. [Lin 2005] Lin, M., Xie, J., Guo, H., Wang, H.: “Solving QoS-driven Web Service Dynamic Composition as Fuzzy Constraint Satisfaction”, IEEE International Conference on e-Technology, e-Commerce and e-Service, 2005, pp. 9-14.
9. [Gao 2006] Gao, A., Yang, D., Tang, Sh., Zhang, M.: “QoS-driven Web Service Composition with Inter Service Conflicts”, APWeb: 8th Asia-Pacific Web Conference, 2006, pp. 121–132.
Quality of Service Issues
□ Multidimensionality:
  □ QoS-driven WS selection algorithms
□ Subjectivity:
  □ dependence on context, consumer, etc.
  □ QoS run-time monitoring and analysis on the user side is required
How to define QoS parameters?
□ Advertised by providers:
  □ simple (popular)
  □ providers may not advertise QoS information
  □ providers are not able to predict QoS in a neutral manner
  □ providers are interested in overstating the real QoS
  □ providers do not intend to constantly revise the advertised QoS
  □ not effective and not trust-aware
□ Monitored on the client side:
  □ active monitoring and/or explicit user feedback (ratings)
  □ high computational overheads
□ Evaluated by a third party:
  □ a specialized unbiased agency tests web services and publishes QoS data
  □ expensive and static
□ Hybrid
QoS Sources
                             Provider   Client        Client      Agency
                                        (monitoring)  (feedback)
Reliable                        -           +             -          +
Objective                       -           +             -          +
Up-to-date                      -           +             +          -
Free of charge                  +           +             +          -
Low computational overheads     +           -             -          +
Recommendation Systems (RS)
□ Examples:
  □ Movies (MovieLens)
  □ Music (JUKEBOX)
  □ Books (Amazon)
  □ Hotels, resorts and vacations (TripAdvisor)
□ Types:
  □ Content-based filtering
  □ Collaborative filtering
  □ Hybrid
Content-based Filtering
□ Recommendations are based on information about the content of items rather than on other users’ opinions
  □ A buys books on economics
  □ B is not interested in computer science
□ Machine learning / text mining algorithms are used to create profiles of user preferences from examples, based on a description of the content
Content-based Filtering
□ Problems:
  □ requires content that can be transformed into a list of features
  □ users’ tastes must be represented as a function of these features
  □ unable to exploit the quality judgments of other users
Collaborative Filtering
□ Users explicitly assign ratings to items
□ To predict the rating of user U for an item I:
  1. Find users similar to U (neighbors)
  2. Calculate the rating of user U for item I as a weighted sum of the ratings given by the neighbors to item I
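The two steps can be sketched in a few lines; cosine similarity over co-rated items and the rating data below are illustrative choices, not a prescribed method:

```python
import math

def cosine_sim(ra, rb):
    """Similarity of two users over the items both have rated."""
    common = set(ra) & set(rb)
    if not common:
        return 0.0
    dot = sum(ra[i] * rb[i] for i in common)
    na = math.sqrt(sum(ra[i] ** 2 for i in common))
    nb = math.sqrt(sum(rb[i] ** 2 for i in common))
    return dot / (na * nb)

def predict(ratings, user, item):
    """Step 1: weigh every other user who rated `item` by similarity;
    step 2: return the similarity-weighted sum of their ratings."""
    sims = [(cosine_sim(ratings[user], r), r[item])
            for u, r in ratings.items() if u != user and item in r]
    norm = sum(s for s, _ in sims)
    return sum(s * r for s, r in sims) / norm if norm else None

ratings = {
    "U": {"a": 5, "b": 3},
    "V": {"a": 5, "b": 3, "i": 4},  # agrees with U, rated i as 4
    "W": {"a": 1, "b": 5, "i": 2},  # disagrees with U, rated i as 2
}
pred = predict(ratings, "U", "i")   # pulled toward V's rating
```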
Collaborative Filtering
□ Problems:
  □ Cannot recommend new items (first-rater problem). Remedies:
    □ Random: choose items randomly with equal probabilities
    □ Content analysis: apply the previously described approach if similar users cannot be found
    □ Filterbots (programs simulating users): constantly search and rate items using some primitive algorithms
  □ Cannot match new users: they have rated nothing (cold-start problem). Remedies:
    □ provide average ratings
    □ user agents collect implicit ratings
    □ put users in categories
    □ select items for users to rate
From recommendations to decision making
□ Define a problem:
  □ return ticket from Trento to Lecce
□ Identify alternatives:
  □ by train: price 100€, duration 26 hours, personal comfort high
  □ by plane (Venezia – Brindisi): price 250€, duration 14 hours, personal comfort low
□ Make the choice:
  □ train
□ Explain the decision:
  □ it is much cheaper
Implicit Culture
□ Provide actors with suggestions based on behavioral patterns extracted from the history of actions
□ A community has knowledge specific to its environment: the community culture
□ Encourage a newcomer to behave according to the community culture
□ http://www.dit.unitn.it/~implicit
Implicit Culture Definitions
□ Action – something that can be done
□ Agent (actor) – somebody or something performing an action
□ Object – something that passively participates in the action
□ Situation – a state of the world faced by the agent; includes a set of objects and a set of possible actions
□ Culture – the usual behavior of a group of agents
□ Group G – the group of agents whose behavior is observed
□ Group G' – the group of agents who require recommendations
□ Implicit Culture relation – holds when agents of the group G' behave similarly to agents of the group G
□ System for Implicit Culture Support (SICS) – a system that tries to establish the IC relation
System for Implicit Culture Support
Diagram: an Observer stores information about agents, objects, actions, and scenes in a DB of observations. An Inductive Module analyzes the observations (together with a domain theory) and produces a theory about common user behavior. A Cultural Action Finder, a Scene Producer, and a Composer use the theory and the observations to produce recommendations about actions (scenes) for the agents.
SICS Architecture
Diagram: the SICS Core (the Composer with its composer adapters and the Inductive Module) rests on a configuration and storage layer (Configuration Module, Rule Storage Module, Storage Module). Applications reach it through the SICS Remote Module and Remote Client, which use AOP helpers (ExceptionManager, LoggingService), SICS and remote-client adapters, and Spring proxies/adapters, exchanging serializable objects over RMI (EJB) or over SOAP (Axis) with the IC-Service.
□ The IC-Service is implemented in Java and uses some libraries
□ It can be used in an application in a number of ways:
  □ as a Java library
  □ as an EJB component in a J2EE environment
  □ as a web service
□ Observations are stored in XML files or in a database
Composer & Inductive Module
Diagram: the Composer (Composer Implementation with Composer, CAF, and Similarity Utilities) exploits the observations and the theory in order to suggest actions in a given situation. The Inductive Module (an Apriori implementation – Apriori Rules Generator and Apriori Algorithm – plus other algorithms) analyzes the stored observations and applies data-mining techniques to find a theory about the community culture. Both are configured per instance (Inductive Module and Composer configuration, configuration of similarity functions), loaded from an XML file via an XML definition loader and a simple class wrapper.
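The Apriori step of the Inductive Module can be sketched as level-wise frequent-itemset mining over observation "transactions"; the observation items below are invented for illustration:

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Level-wise Apriori: only frequent k-itemsets are extended
    into (k+1)-itemset candidates."""
    transactions = [frozenset(t) for t in transactions]
    current = {frozenset([i]) for t in transactions for i in t}
    freq = {}
    while current:
        counts = Counter()
        for t in transactions:
            for cand in current:
                if cand <= t:       # candidate contained in transaction
                    counts[cand] += 1
        level = {c: n for c, n in counts.items() if n >= min_support}
        freq.update(level)
        keys = list(level)
        k = len(keys[0]) + 1 if keys else 0
        # candidate generation: unions of frequent k-sets of size k+1
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k}
    return freq

trans = [{"query:currency", "invoke:ConversionRate"},
         {"query:currency", "invoke:ConversionRate"},
         {"query:weather", "invoke:GetWeather"}]
freq = frequent_itemsets(trans, min_support=2)
```

Frequent itemsets such as {query:currency, invoke:ConversionRate} are the raw material for rules like "after this query, this operation is usually invoked".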
IC for Web Service Selection
Diagram: applications interact with web services through proxies (Axis) and with a registry; the community is observed by the IC-Service. Message labels: <develop>, <register>, <query>, <recommend>, <invoke>, <respond>, <report>, <feedback>.
□ How to select a high-quality web service suitable for your problem?
  □ History-based selection
  □ Quality of Service = Quality of Experience
Observation of web service invocations
□ Actors:□ Applications (application name, user name, location)□ Users (user name, location)
□ Objects:□ Operation (operation name, web service name, category)□ Inputs/Outputs (parameter name, parameter value)□ Requests (operation names, input/output parameters,
category)□ Actions:
□ Bind (timestamp, web service), □ Invoke (timestamp, operation, input),□ Get response (timestamp, operation, output, response time), □ Raise exception (timestamp, operation, exception type, input), □ Provide feedback (report about contract violations, domain-
specific QoS parameters), □ Submit query (request, preferences)
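The observation schema above can be mirrored as record types; the class and field names here are hypothetical (the real IC-Service stores observations in XML files or a database):

```python
from dataclasses import dataclass, field

@dataclass
class Operation:
    """Object: an operation of a registered web service."""
    name: str
    service: str
    category: str

@dataclass
class Invoke:
    """Action: a timestamped invocation of an operation with its inputs."""
    timestamp: float
    operation: Operation
    inputs: dict = field(default_factory=dict)

# Hypothetical observation of a currency-conversion call
obs = Invoke(
    timestamp=1161324000.0,
    operation=Operation("ConversionRate", "CurrencyConvertor", "currency"),
    inputs={"FromCurrency": "EUR", "ToCurrency": "USD"},
)
```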
Example of the theory rules
□Observations:□submit(A, newRequest(category=“currency”);…)□invoke(A, {http://www.webserviceX.NET/}
CurrencyConvertor & conversionRate;…)
□Theory rules:□submit(_U; _Q; …) invoke (_Y, *(category =
extract(category, _Q))); …)
□Request:□submit(newClient;
newRequest(category=currency); …)
Future work
□ Empirical evaluation
□ Customizable similarity evaluation
□ Other mining algorithms
□ Enrich SICS with semantic matching: a hierarchy of actions, objects, attributes, etc.
Related work
□ [Blanzieri01] Blanzieri, E., Giorgini, P., Massa, P., Recla, S.: “Implicit culture for multi-agent interaction support”, Proc. of the Int. Conf. on Cooperative Information Systems, 2001, pp. 27-39.
□ [Maximilien04] Maximilien, E.M., Singh, M.P.: “A Framework and Ontology for Dynamic Web Services Selection”, IEEE Internet Computing, Vol. 8, No. 5, 2004, pp. 84-93.
□ [Manikrao05] Manikrao, U. Sh., Prabhakar, T.V.: “Dynamic Selection of Web Services with Recommendation System”, International Conference on Next Generation Web Services Practices (NWeSP'05), 2005, pp. 117-121.
Further information
□ Kokash, N.: "A Comparison of Web Service Interface Similarity Measures", Proceedings of STAIRS'06, Riva del Garda, Italy, August 2006, pp. 220--231, full paper. Extended version: Technical Report No DIT-06-025, April 2006, University of Trento, Italy.
□ Kokash, N., Van den Heuvel, W.-J., D'Andrea, V.: "Leveraging Web Services Discovery with Customizable Hybrid Matching", Proceedings of ICSOC, Chicago, December 2006, short paper, to appear. Extended version: Technical Report No DIT-06-042, July 2006, University of Trento, Italy.
□ Kokash, N.: "A Service Selection Model to Improve Composition Reliability", International Workshop on AI for Service Composition, in conjunction with ECAI'06, Riva del Garda, Italy, August 2006, pp. 9--14, full paper.
□ Birukou, A., Blanzieri, E., D'Andrea, V., Giorgini, P., Kokash, N., Modena, A.: "IC-Service: A Service-Oriented Approach to the Development of Recommendation Systems", The ACM Symposium on Applied Computing, Special Track on Web Technologies (WT), March 2007, to appear. Technical Report No DIT-06-044, July 2006, University of Trento, Italy.
□ http://dit.unitn.it/~kokash/