A BROKER-BASED WEB SERVICE ARCHITECTURE FOR SERVICE REGISTRATION
International Journal of Emerging Technology and Advanced Engineering
Website: www.ijetae.com (ISSN 2250-2459 (Online), An ISO 9001:2008 Certified Journal, Volume 3, Special Issue 1, January 2013)
International Conference on Information Systems and Computing (ICISC-2013), INDIA.
Sri Sai Ram Engineering College, An ISO 9001:2008 Certified & NBA Accredited Engineering Institute, Chennai, INDIA.
A BROKER-BASED WEB SERVICE ARCHITECTURE FOR SERVICE REGISTRATION AND DISCOVERY WITH QOS

Dr. T. Rajendran
Professor and Dean, Department of CSE &IT, Angel College of Engineering and Technology, Tirupur, India.
Email: [email protected]
Abstract
The increasing number of Web services on the web compels consumers to apply tools to search for appropriate Web services available throughout the world. Web service technology provides a way to simplify interoperability among different platforms. Since many Web services exhibit similar functional characteristics, non-functional properties (QoS) have become essential criteria for enhancing the discovery and registration of services. The purpose of Web service discovery is to choose the best possible Web service for a particular task. When dynamic discovery is used with Web services, it is common that the result of the discovery contains more than one service provider. In this paper, we propose a novel approach to designing and developing a broker-based architecture (WSB) and its QoS-based matching, ranking and selection algorithm for evaluating and discovering the best possible Web services for a specific task. We explain the underlying ideas of our approach, and experimental results demonstrate its effectiveness.
Keywords--- Broker-Based, Web Service Architecture, Service Registration, Service Discovery, Quality of Service (QoS).
I. INTRODUCTION
Service-Oriented Architecture (SOA) is an architectural approach for constructing complex software-intensive systems from a set of universally interconnected and interdependent building blocks, following principles applied during the phases of system development and integration. A deployed SOA provides a loosely coupled suite of services that can be used within multiple business domains. An SOA is essentially a collection of services that communicate with each other.
The communication can involve either simple data
passing or it could involve two or more services
coordinating some activity. Some means of connecting
services to each other is needed. Web service is a
software system designed to support interoperable
machine-to-machine interaction over a network. Web
Services interact with each other, fulfilling tasks and
requests that, in turn, carry out parts of complex
transactions or workflows. If multiple Web services
provide the same functionality, then Quality of Service
(QoS) requirements can be used as a secondary criterion
for service selection.
QoS is a set of non-functional attributes like service
response time, throughput, reliability, and availability
[1], [2]. The current Universal Description, Discovery
and Integration (UDDI) registries only support Web
services discovery based on the functional aspects of
services [1].
The problem, therefore, is first to accommodate the QoS information in the UDDI, and second to guarantee some degree of authenticity of the published QoS information, since QoS information published by service providers may not always be accurate and up to date.
There are two major problems in using QoS for Web
service discovery. First is the specification and storage of
the QoS information, and second is the specification of
the customer's requirements and matching these against
the information available. Major efforts in this area
include Web Services Level Agreements (WSLA) [3] by
IBM, Web Services Policy Framework (WS Policy) [4],
and the Ontology Web Language for Services (OWL-S)
[5]. Most of these efforts represent a complex framework
focusing not only on QoS specifications, but on a more
complete set of aspects relating to Web services. Some
researchers propose other simpler models and approaches
[6]-[8] for dynamic Web services discovery. However,
they all struggle with the same challenges related to QoS
publishing and matching.
Nowadays, both Web services providers and clients
are concerned with the QoS guaranteed by Web services.
From the client point of view, Web service based QoS
discovery is a multi-criteria decision mechanism that
requires knowledge about the service and its QoS
description. However, most of the clients are not
experienced enough to acquire the best selection of Web
service based on its described QoS characteristics.
They simply trust the QoS information published by
the provider; however, most of the Web services
providers do not guarantee and provide assurance to the
level of QoS offered by their Web services. Based on the
above, we propose a Web services discovery architecture
that contains an extended UDDI to accommodate the
QoS information, and Web Service Broker (WSB) to
facilitate the service discovery.
We develop a service matching, ranking and selection
algorithm based on a matching algorithm proposed by
Maximilien and Singh [9]. Our algorithm finds a set of services that match the consumer's requirements, ranks these services using their QoS information and feedback ratings, and finally returns the top M services (M indicates the maximum number of services to be returned) based on the consumer's preferences in the service discovery request.
QoS delivered to a client may be affected by many
factors, including the performance of the Web service
itself, the hosting platform and the underlying network. A
set of verification procedures is essential for providers to
remain competitive and for clients to make the right
selection and trust the published QoS metrics. For the
success of any QoS based Web services architecture, it
should support a set of features: 1) QoS Verification and
Certification to guide Web services discovery and 2) QoS
aware Web services publishing and discovery. In this
paper, we propose a broker-based architecture for Web
service registration and discovery with QoS
characteristics. The role of the Web Service Broker
(WSB) is to support QoS provisioning and assurance in
delivering Web services. It implements the concept of
Quality Analysis, and QoS verifying and certifying
process. The goal of this research is to investigate how
dynamic Web service discovery can be realized to satisfy
a customer's QoS requirements using a new architecture
that can be accommodated within the existing basic Web
service protocols.
The rest of the paper is organized as follows. Section 2
outlines the related research conducted in the area of
Web Services Discovery, QoS and reputation. In Section
3, the proposed Web Service Broker architecture for Web
service registration and discovery is analysed and
discussed. Section 4 presents experimental results which
evaluate the effectiveness of our architecture. Section 5
concludes the paper and presents possible future research
work.
II. RELATED WORK
Researchers have proposed various approaches for
dynamic Web service discovery. Maximilien and Singh
[9] proposed a multi-agent based architecture to select
the best service according to consumers' preferences.
Blum [10] proposes to extend the use of Technical
models (tModels), within the UDDI to represent different
categories of information such as version and QoS
information. Ran [1] proposes an extended service
discovery model containing the traditional components:
service provider, service consumer, and UDDI registry,
along with a new component called a Certifier. The
Certifier verifies the QoS of a Web service before its
registration.
The consumer can also verify the advertised QoS with
the Certifier before binding to a Web service. Although this model incorporates QoS into the UDDI, it neither integrates consumer feedback into the service discovery process nor supports the dynamism of Web services.
Crasso et al. [31] discussed eight shortcomings in present Web service descriptions and termed them 'anti-patterns', which prevent the efficient discovery of services. They experimentally showed that discovery is more accurate when all such 'anti-patterns' are removed. Different domains and applications may require
different QoS properties; therefore we need a more
efficient and flexible method to express QoS information
[32].
Mydhili and Gopalakrishna [33] gave an exhaustive review summarizing the results, observations, and findings of renowned researchers in the domain of Web service discovery. They highlighted a list of problems that need to be investigated, and presented a comprehensive listing of the open unresolved issues in a novel way by providing a Cause-Effect Analysis for each.
Shrabani Mallick and Kushwaha [11] proposed a Web service discovery architecture based on x-SOA that organizes Web service discovery in an efficient and structured manner using an intermediary, the Request Analyser (RA), between service provider and service consumer. Their algorithm, LWSDM, works for a complete cycle of Web service discovery.
Majithia et al. [12] proposed a framework for
reputation-based semantic service discovery. Ratings of
services in different contexts are collected from service
consumers by a reputation management system. T. Rajendran and P. Balasubramanie [13] presented a framework for agent-based Web services discovery with QoS to select a suitable Web service that satisfies clients' preferences and QoS constraints. It contains an extended UDDI to accommodate the QoS information.
IBM proposes Web Service Level Agreements (WSLA),
which is an XML specification of SLAs for Web services
focusing on QoS constraints [14]. Many of these approaches do not guarantee the accuracy of QoS values over time, nor do they ensure up-to-date QoS information.
Farkas and Charaf [15] proposed a software architecture to provide QoS-enabled Web services by adding a QoS broker between clients and service providers to discover QoS-aware services in the UDDI. However, no detailed and accurate information about the QoS broker, such as how it is designed and what functionality it provides, has been presented so far.
UDDI extension to support QoS-enriched service
publication and discovery has generated several research
efforts.
Shaikh Ali's approach [16] was based on extending the UDDI business service structure, but potential QoS changes are not considered. Chen et al. [17] proposed a
registry that receives reports made by consumers to
generate QoS summaries for invoked Web services.
Kalepu et al. [18] evaluated the reputation of a service as a function of three factors: ratings made by users, service quality compliance, and the change of service quality conformance over time. However, these solutions do not
take into account the trustworthiness of QoS reports
produced by users, which is important to assure the
accuracy of the QoS-based Web service selection and
ranking results. Liu et al. [19] proposed an approach that rates services computationally in terms of their quality performance, using QoS information provided by monitoring services and users. The authors also employ a
simple approach of reputation management by
identifying every requester to avoid report flooding. In
Diego Zuquim Guimaraes Garcia and Maria Beatriz
Felgar de Toledo [20], an extended Web service
architecture to support QoS management was explained.
The architecture is currently being integrated with
Business Process Management (BPM) Technology. The
major contributions are: Extending the WS policy
framework to specify QoS policies for Web services,
extending the UDDI information model and API set to
refine service discovery and using tModels to define QoS
related concepts.
Tian et al. [21] showed a WS-QoS architecture that
enables QoS-aware service specifications as well as the
broker based Web service selection model that enables an
efficient QoS-aware service selection. Eyhab Al-Masri
and Qusay H. Mahmoud [22] introduced a mechanism
that extends the Web Services Repository Builder
(WSRB) of Web Services. It also introduced the Web Service Relevancy Function (WsRF), used for measuring the relevancy ranking of a particular Web service based on the client's preferences and QoS metrics. Xu et al. [23]
provided a Web service discovery model that contains an
extended UDDI to accommodate the QoS information, a
reputation management system to build and maintain
service reputations and a discovery agent to facilitate
service discovery. A service matching, ranking and
selection algorithm is also developed.
Demian Antony D'Mello et al. [24] explored different types of requesters' QoS requirements and a tree model for representing them. They also proposed a QoS-broker-based Web service architecture which facilitates the requester in selecting a suitable Web service based on QoS requirements and preferences. The Web service selection and ranking mechanism uses the QoS-broker-based architecture [26]. The QoS broker is responsible for the selection and ranking of functionally similar Web services.
The Web service selection mechanism [26] ranks the Web services based on prospective levels of satisfaction of the requester's QoS constraints and preferences. Serhani [27] proposed a Web service architecture employing an extended UDDI registry to support service selection based on QoS, but only the certification approach is used to verify QoS, and no information is provided about the QoS specification.
Hongan Chen et al. [28] presented a description and an
implementation of broker-based architecture for
controlling QoS of Web services. The broker acts as an
intermediary third party to make Web services selection
and QoS negotiation on behalf of the client. Delegation
of selection and negotiation raises trustworthiness issues
mainly for clients. Performance of the broker is not
considered in this approach. Moreover, performance of
the broker can be a key to the success of any proposed
architecture; if the user does not get a response to his/her
request with an acceptable response time, he/she will
switch to another provider. Wishart [29] offered a protocol for service reputation in which an ageing factor is applied to each rating for a service, so that recent ratings carry more weight than older ones. The value of the ageing factor should depend on the requirements of the service consumers and the reputation management system. Hu [30] showed how a Web service can be selected by matching requested QoS property values against the QoS property values of candidate Web services; in that work, the Web service is selected by taking the requester's average preference for QoS properties. Using FIPA-compliant multi-agents, Jaleh Shoshtarian Malak [34] proposed a multi-agent-based Web service QoS management architecture, introducing a QoS-based Web service clustering technique that helps choose the service best suited to the user's quality preferences. Many of these approaches do not guarantee the accuracy of QoS values over time, nor do they ensure up-to-date QoS information. In the next section, we describe the design of the proposed Web Service Broker architecture, which overcomes many of the limitations imposed by the current discovery model.
III. WSB WEB SERVICE ARCHITECTURE WITH QOS
The purpose of Web service discovery is to select the most suitable Web service for a particular task. When
dynamic discovery is used in Web services, it is common
that the result of the discovery contains more than one
provider. We propose a technique for dynamic discovery
of Web services which will also handle the problem of
redundant Web services.
The architecture consists of the basic Web service
model components like the Web service provider, Web
service consumer and the UDDI registry.
In addition, the UDDI registry has the capability to store QoS information using the tModel data structure, and the architecture includes a WSB. The components of the architecture are presented in Fig. 1. We propose a Web services discovery architecture which contains an extended UDDI to accommodate the QoS parameters [23]. The WSB assists clients in selecting Web services based on a set of QoS parameters. The WSB has five components: Service Publisher [24], Verifier and Certifier, Retrieval Agent, Quality Analyzer, and Web Service Storage (WSS) [25]. Broker services may be used to facilitate service registry access; the broker performs the interaction with the UDDI.
Fig. 1. Architecture for WSB
The broker is a Web service performing a collection of
QoS functionalities. It is the entity that performs the
verification and certification tasks. It is also involved in
other operations, such as registering and selecting
services with QoS functions.
To overcome many of the limitations imposed by the current discovery model, we introduce the Web Service Broker architecture shown in Fig. 1.
The service publisher component facilitates the
registration, updating and deletion of Web service related
information. It gets the business- and performance-
specific QoS property values of Web services from the
service providers. The service provider publishes its
service functionality to the UDDI registry through the
service publisher after certification and verification. For
every service or group of services there exists a service
publisher that handles all communication with registries,
bindings, negotiations, requests, and responses for that
service. The service consumer can search the UDDI
registry for a specific service through the retrieval agent.
The main functionality of the retrieval agent component is to select the most suitable Web service satisfying the requester's QoS constraints and preferences, along with the service functionality. The service requester can verify the advertised QoS with the retrieval agent before binding to a Web service. QoS verification is the process of validating the correctness of the information described in the service interface as well as the described QoS parameters. The result of verification is used as input for the certification process, which is initiated when the verification succeeds. The QoS property values
obtained from the service providers are verified and
certified by the Verifier and Certifier component before
registering them into the UDDI registry. The Verifier and
Certifier component is implemented within the WSB and
is responsible for certifying Web services and their
provided QoS. A certificate is sent to the Web service
provider and a copy is stored in the WSS for future use.
A sequence of interaction between these components is
presented in Fig. 2.
A typical usage scenario (Fig. 2) is described here by
considering an example in which a consumer uses a Web
service in its application.
1. Initially Web Service Broker (WSB) publishes the
interface to the UDDI registry.
2. Web service provider finds the broker interface in
UDDI registry.
3. The service provider registers the Web service with
the service publisher which is available in the WSB
and provides functional and non-functional
information about the offered services.
4. The Verifier and Certifier component of the WSB
verifies the QoS information and issues a
certificate.
5. A copy of the QoS certificate is stored in WSS and
a copy is sent to the service provider.
6. The service publisher then publishes the Web
service in the UDDI registry along with the QoS
certificate.
7. The consumer application requests service
discovery and provides functional and QoS
requirements.
8. The retrieval agent finds the service in the UDDI
registry according to the required service
functionality and QoS requirements of the
application.
9. The retrieval agent can communicate with quality
analyzer to verify the provided QoS certificate and
rating scores with the one stored in the WSS.
10. The retrieval agent then reports the discovered
service back to the application (consumer).
11. The Web service consumer then binds to the Web
service from the service provider.
12. The consumer sends feedback to the quality analyzer
after invoking the service.
13. The quality analyzer calculates the rating scores from
consumer feedback and stores them in the database
(WSS) for the service discovery process.
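As an illustration, the registration path (steps 3-6) can be sketched in Python. The class and method names below are hypothetical, since the paper does not define a programming interface for the broker components, and the verification check is a deliberately simplified placeholder:

```python
class WSBroker:
    """Minimal sketch of the WSB registration path (steps 3-6).

    All names are illustrative; the WSS and UDDI registry are
    modelled here as plain dictionaries.
    """
    def __init__(self):
        self.wss = {}   # Web Service Storage: service -> certificate copy
        self.uddi = {}  # UDDI registry: service -> published record

    def register(self, service, qos):
        # Step 4: the Verifier and Certifier checks the supplied QoS values.
        if not self._verify(qos):
            return None  # certificate refused; feedback goes to the provider
        cert = {"service": service, "qos": qos, "class": "A"}
        # Step 5: one copy is kept in the WSS, one is returned to the provider.
        self.wss[service] = cert
        # Step 6: the service publisher registers the service with its certificate.
        self.uddi[service] = {"qos": qos, "certificate": cert}
        return cert

    def _verify(self, qos):
        # Placeholder check: the four QoS parameters must all be present.
        required = {"response_time", "availability", "throughput", "price"}
        return required <= qos.keys()
```

A provider whose QoS description fails verification receives no certificate and its service is not published.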
1.1 Service Publisher
The service publisher component communicates with
the service provider and the UDDI registry. The service
provider registers the business and Web service related
information with the service publisher. It also gets the
specific QoS property values of Web services from
providers. Once the QoS property values and other information are obtained from the provider, they are presented to the Verifier and Certifier component. The QoS information is verified and certified before it is published in the UDDI registry.
Fig. 2. Architectural Component Interactions
1.2 Verifier and Certifier
This is the key component of the WSB that performs
the verification of the QoS information supplied by the
service provider and issues a certificate to the service
provider through the service publisher. This QoS certificate assures that the QoS offered by the provider conforms to its description. The service provider
initiates the verification process through the service
publisher by supplying the QoS property values. The
verifier is provided with the WSDL document and
additional information about resources available at the
provider‟s platform. The verifier performs the testing of
the service URI, the XML schema definition, the service
binding information and the availability of all operations
described in the service interface. Verifier also performs
the verification of the QoS information introduced in the
service interface.
The QoS verification is conducted through a set of test cases generated by the verifier. For each test, additional information about the provider and its Web service, such as server capacity and network bandwidth, is needed. The four QoS parameters (Response Time, Availability, Throughput, and Price) are also verified. The verification
process is done in three levels: General Web services
information verification, WSDL content verification, and
QoS verification. A Web service is said to be compliant
with a given level when it passes the corresponding tests
described in the verification document. Based on this,
Web services can be classified into three classes. Class A
includes Web services for which all verification tests
have succeeded. Class B includes Web services for which
more than 80% of the verification tests have succeeded.
Class C contains the services for which most of the
verification scenarios have failed.
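The three-class compliance assignment can be sketched as a small helper. The function name is illustrative; the only thresholds taken from the text are "all tests passed" for Class A and "more than 80% passed" for Class B, with Class C covering the rest:

```python
def classify_service(passed: int, total: int) -> str:
    """Assign a verification class from the fraction of passed tests.

    Class A: every verification test succeeded.
    Class B: more than 80% of the tests succeeded.
    Class C: otherwise (most verification scenarios failed).
    """
    if total <= 0:
        raise ValueError("total number of tests must be positive")
    if passed == total:
        return "A"
    if passed / total > 0.8:
        return "B"
    return "C"
```

Note that a service passing exactly 80% of its tests falls into Class C, since Class B requires strictly more than 80%.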
Once the verification process is completed successfully, the certification process is initiated. The certifier issues a certificate to the service provider through the service publisher, indicating that the offered QoS conforms to its description. The main responsibility of the certifier is to certify Web services and their provided QoS. A copy of the certificate is sent to the service provider, and another is stored in the WSS for future use. The certificate includes information such as the certificate number, certificate issue date, number of years in business, class type, and service location. If the certificate cannot be issued, feedback is sent to the provider. After the QoS certification process, the service publisher can register the functional description of the service and the certified QoS information with the UDDI registry.
1.3 Retrieval Agent
The retrieval agent component is concerned with
selecting the most suitable Web service satisfying the consumer's QoS constraints and the specific service
functionality. It receives messages from the Web service
consumer, specifying the service functionality along with
the QoS constraints. Based on the received requirements
specification, it discovers functionally similar Web
services from the UDDI registry. The retrieval agent can
check the validity of the QoS information in the UDDI
registry by comparing the QoS certificate provided by the
Verifier and Certifier with the one stored in the WSS.
1.4 Quality Analyzer
After a Web service task is finished, a client may summarize the QoS experience and send it to the quality analyzer in the WSB. The quality analyzer collects
feedback regarding the service and QoS of the Web
services from the service consumers, calculates rating
scores, and updates these scores in the WSS.
The quality analyzer uses this information during the service discovery phase. For this work, we assume that all ratings are available, objective and valid. Service consumers provide a 'rating' indicating their level of satisfaction with a service after each interaction with it. A rating is simply an integer ranging from 1 to 10, where 10 indicates extreme satisfaction and 1 extreme dissatisfaction. Every rating record includes a service ID, a consumer ID, a timestamp, and a rating value. The service key of the service in the UDDI registry is used as the service ID, and the IP address of the consumer is used as the consumer ID. The quality analyzer stores the rating scores for services in a table; an example is given in Table I. For instance, the consumer with IP address "192.168.2.5" gave the service with key "67340054-e8x6-11d5-9ea8-90030254267054" a rating score of 9 at timestamp "2011-03-01 10:15:09". Only the most recent rating by a consumer for a service is stored in the database: a new rating from the same consumer for the same service replaces the older one. The timestamp is used to determine the age factor of a particular service rating. We use equation (1) to calculate the feedback rating.
RS = (1/n) * Σ_{i=1}^{n} S_i    (1)

where RS is the rating score for the service, n is the number of ratings for the service, and S_i is the i-th rating for the service.
Table I
EXAMPLE RATINGS FOR SERVICES

Service ID                                Consumer ID     Timestamp            Rating Score
67340054-e8x6-11d5-9ea8-90030254267054    192.168.2.5     2011-03-01 10:15:09  9
9521db61-eac1-42e4-5ab0-1d87b8f1876a      192.168.4.75    2011-03-01 11:15:09  8
74154900-f0b0-11d5-bca4-002035229c64      192.168.2.70    2011-03-02 11:25:21  7
9021cb6e-e8c9-4fe3-9ea8-3c99b1fa8bf3      192.168.2.155   2011-03-02 12:15:35  6
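The quality analyzer's rating store can be sketched as follows, under the assumptions stated above (integer ratings from 1 to 10, one stored rating per consumer-service pair, newer ratings replacing older ones, and equation (1) for the feedback rating). All class and field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Rating:
    service_id: str   # UDDI service key
    consumer_id: str  # consumer IP address
    timestamp: str
    score: int        # 1 (extreme dissatisfaction) .. 10 (extreme satisfaction)

class QualityAnalyzer:
    """Keeps only the most recent rating per (service, consumer) pair."""

    def __init__(self):
        self._ratings = {}  # (service_id, consumer_id) -> Rating

    def record(self, r: Rating) -> None:
        if not 1 <= r.score <= 10:
            raise ValueError("rating must be an integer between 1 and 10")
        # A new rating from the same consumer replaces the previous one.
        self._ratings[(r.service_id, r.consumer_id)] = r

    def rating_score(self, service_id: str) -> float:
        """RS = (1/n) * sum of the n stored ratings for the service, eq. (1)."""
        scores = [r.score for r in self._ratings.values()
                  if r.service_id == service_id]
        if not scores:
            return 0.0
        return sum(scores) / len(scores)
```

Because a re-rating overwrites rather than accumulates, a consumer cannot inflate RS by rating the same service repeatedly.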
1.5 QoS Matching, Ranking and Selection Algorithm
A Web service consumer sends a service discovery request to the retrieval agent, which then contacts the UDDI registry to find services that meet the customer's functional and QoS requirements. A service is said to be a "match" if it satisfies the customer's functional requirements and its QoS information meets the customer's QoS requirements.
If no matched service is found by the matching process, the retrieval agent returns an empty result to the customer. If multiple services match the functional and QoS requirements, the retrieval agent calculates a QoS score for each matched service based on the dominant QoS attribute specified by the customer, or on the default dominant attribute, average response time. The best service is assigned a score of 1, and the other services are assigned scores based on the value of the dominant QoS attribute. The top M services (M being the maximum number of services to be returned, as specified by the customer) with the highest QoS scores are returned to the customer. If M is not specified, one service is randomly selected from those services whose QoS score is greater than LowLimit.
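The relative scoring step can be sketched as follows. The exact normalization used here (best_value/value for attributes where lower is better, such as response time, and value/best_value otherwise, such as throughput) is an assumption, since the text only states that the best service receives a score of 1 and the others are scored from the dominant attribute's value:

```python
def qos_scores(services, attr="response_time", lower_is_better=True):
    """Score each matched service on the dominant QoS attribute.

    `services` maps a service name to a dict of QoS attribute values.
    The best service receives 1.0; the others receive a value in (0, 1]
    proportional to how close they are to the best.
    """
    values = {s: q[attr] for s, q in services.items()}
    best = min(values.values()) if lower_is_better else max(values.values())
    if lower_is_better:
        return {s: best / v for s, v in values.items()}
    return {s: v / best for s, v in values.items()}

def top_m(scores, m):
    """Return the m service names with the highest QoS scores."""
    return sorted(scores, key=scores.get, reverse=True)[:m]
```

For example, with response times of 800, 1000 and 1600 ms, the fastest service scores 1.0 and the slowest 0.5, and `top_m` then truncates the ranked list to the customer's M.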
/*Web services matching, ranking and selection algorithm */
1 findServices (functionRequirements, qosRequirements,
feedbackRequirements, maxNumServices)
{
// find services that meet the functional requirements
2 fMatches = fMatch (functionRequirements);
3 if QoS requirements specified {
// match services with QoS information
4 qMatches = qosMatch (fMatches, qosRequirements); }
5 else {
// select max number of services to be returned
6 return selectServices (fMatches, maxNumServices, "random"); }
7 if feedback requirements specified {
// rank matches with QoS and feedback information
8 matches = feedbackRank (qMatches, qosRequirements,
feedbackRequirements);
// select max number of services to be returned
9 return selectServices (matches, maxNumServices, "byQoS"); }
10 else {
// rank matches with QoS information only
11 return selectServices (qMatches, maxNumServices, "byOverall");
}
}
Fig. 3. Service Matching, Ranking and Selection Algorithm
Fig. 3 shows a simplified version of our service
selection algorithm where the leftmost numbers denote
the line numbers. When the retrieval agent receives a
discovery request, it executes fMatch (line 2) which
returns a list of services LS1 that meet the functional
requirements. If QoS requirements are specified,
qosMatch (line 4) is executed next on the set of services
LS1 and it returns a subset of services LS2 that meet the
QoS requirements. selectServices (line 6, 9, 11) always
returns a list of M services to the customer where M
denotes the maximum number of services to be returned
as specified in the discovery request. If QoS requirements
are not specified, selectServices returns M randomly
selected services from LS1. If only one service satisfies
the selection criteria, it returns this service to the
customer.
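For concreteness, the control flow of Fig. 3 can be rendered as a runnable Python sketch. The registry layout and field names are illustrative, feedbackRank is approximated as a simple rating filter, and the LowLimit threshold value is an assumption (the paper names the threshold but does not fix its value):

```python
import random

LOW_LIMIT = 0.5  # illustrative value for the paper's LowLimit threshold

def find_services(registry, functional, qos=None, feedback=None, max_num=None):
    """Sketch of the matching, ranking and selection algorithm (Fig. 3).

    `registry` maps service names to dicts with illustrative
    'function', 'qos_score' and 'rating' fields.
    """
    # fMatch: services meeting the functional requirements (LS1)
    ls1 = [s for s, d in registry.items() if d["function"] == functional]
    if qos is None:
        # No QoS requirements: return up to max_num randomly chosen services.
        k = min(max_num or 1, len(ls1))
        return random.sample(ls1, k)
    # qosMatch: subset of LS1 meeting the QoS requirements (LS2)
    ls2 = [s for s in ls1 if registry[s]["qos_score"] >= qos]
    if feedback is not None:
        # feedbackRank (approximated): keep services with an adequate rating.
        ls2 = [s for s in ls2 if registry[s]["rating"] >= feedback]
    # Rank the remaining services by QoS score, best first.
    ranked = sorted(ls2, key=lambda s: registry[s]["qos_score"], reverse=True)
    if max_num is not None:
        return ranked[:max_num]
    # M not specified: pick one service at random among scores above LowLimit.
    eligible = [s for s in ranked if registry[s]["qos_score"] > LOW_LIMIT]
    return random.sample(eligible, 1) if eligible else []
```

If no service survives the matching steps, an empty list is returned, mirroring the empty result the retrieval agent sends back to the customer.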
IV. EXPERIMENTAL RESULTS AND DISCUSSION
This section presents an experimental evaluation of the proposed approach. The proposed Web service broker-based discovery system is implemented on the Windows platform using the Microsoft Visual Studio .NET development environment, Microsoft IIS 6.0 as the Web server, and ASP.NET with C#.NET as the programming language. To enable interaction between the .NET program and the UDDI-compliant server, Microsoft UDDI SDK 2.0 is used. Microsoft SQL Server 2005 Express Edition is used as the database. The QoS parameters considered in this approach are response time, service availability, price and throughput. To show the applicability of this approach, a prototype has been developed.
The experimental set up is carried out for four
scenarios with the QoS properties such as response time,
service availability, price and throughput. The QoS
attributes are evaluated based on the number of clients
and the available resources.
1.6 Scenario 1 (Discovery Process)
Table 2 summarizes the QoS requirements of the four service consumers: RT in milliseconds, TP in requests per minute, price in dollars per transaction and service availability (AV) as a percentage. All consumers specify that the maximum number of services to be returned is 1. All four consumers C1, C2, C3 and C4 constrain the four QoS attributes response time, service availability, price and throughput. In this experimental set-up, ten functionally matched online ticket booking services returned from the UDDI are considered. These services are evaluated by the WSB against the QoS parameters response time, service availability, price and throughput. From this set of services, the WSB selects the single best service for each consumer based on the requirements specified in Table 2.
Table 2
CONSUMERS' QOS REQUIREMENTS

Consumer | RT (ms) | TP (req/min) | Price (dollars/transaction) | AV (%) | Feedback Rating
C1       | 965     | 9            | 0.04                        | 97     | 7
C2       | 800     | 6            | 0.03                        | 95     | -
C3       | 1200    | 7            | 0.04                        | 96     | 5
C4       | 1100    | 5            | 0.03                        | 98     | -
The values shown in Table 3 represent QoS values measured through the implemented prototype for ten different Web services. In order to find the most suitable Web service, it is essential to optimize the value of each QoS parameter.
Table 3
QOS VALUES FOR VARIOUS AVAILABLE ONLINE TICKET BOOKING SERVICES

Service ID | Service Provider Name | RT (ms) | TP (req/min) | AV (%) | Price (dollars/transaction) | AC (%) | IN (%) | Feedback Rating
1          | SP1                   | 850     | 8            | 97     | 0.04                        | 92     | 95     | 7
2          | SP2                   | 710     | 12           | 100    | 0.05                        | 96     | 100    | 8
3          | SP3                   | 900     | 7            | 95     | 0.01                        | 95     | 94     | 6
4          | SP4                   | 700     | 11           | 100    | 0.03                        | 98     | 100    | 9
5          | SP5                   | 1314    | 5            | 98     | 0.01                        | 97     | 96     | 7
6          | SP6                   | 650     | 12           | 97     | 0.05                        | 96     | 100    | 8
7          | SP7                   | 870     | 9            | 95     | 0.04                        | 98     | 97     | 9
8          | SP8                   | 1350    | 4            | 100    | 0.012                       | 93     | 95     | 6
9          | SP9                   | 1200    | 5            | 94     | 0.01                        | 92     | 96     | 5
10         | SP10                  | 1150    | 6            | 96     | 0.02                        | 96     | 94     | 6
For each consumer, the same service discovery request was run 10 times, and the service selected in each run is shown in Fig. 4. For C1, C2, C3 and C4, a service is selected based on the response time, service availability, price and throughput that meet the consumer requirements. It can be observed from the figure that services 2, 4 and 6 are chosen randomly for C1 as the best services satisfying the requirements given in Table 2. For C2, the services randomly selected by the WSB are 2 and 4. Services 1, 2, 4, 6 and 7 are chosen randomly for consumer C3, as they satisfy the requirements given in Table 2. Finally, for consumer C4, the randomly chosen services are 2 and 4.
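One way to read the matching step is as an attribute-wise bound check of each Table 3 row against a consumer's Table 2 requirements. The sketch below is our own illustration, not the WSB's actual matching rule; in particular, the direction of each bound (lower-is-better for response time and price, higher-is-better for throughput, availability and feedback) is an assumption, and only two Table 3 rows are reproduced.

```python
# Assumed constraint directions (not stated explicitly in the paper)
LOWER_IS_BETTER = {"rt", "price"}
HIGHER_IS_BETTER = {"tp", "av", "feedback"}

def meets_requirements(service, req):
    """True if the service satisfies every QoS bound the consumer specified."""
    for attr, bound in req.items():
        if bound is None:          # consumer did not constrain this attribute
            continue
        value = service[attr]
        if attr in LOWER_IS_BETTER and value > bound:
            return False
        if attr in HIGHER_IS_BETTER and value < bound:
            return False
    return True

# Two rows of Table 3 (online ticket booking services)
services = [
    {"id": 4, "rt": 700, "tp": 11, "av": 100, "price": 0.03, "feedback": 9},
    {"id": 8, "rt": 1350, "tp": 4, "av": 100, "price": 0.012, "feedback": 6},
]
# Consumer C1's requirements from Table 2
c1 = {"rt": 965, "tp": 9, "av": 97, "price": 0.04, "feedback": 7}
matches = [s["id"] for s in services if meets_requirements(s, c1)]
```

Under these assumed directions, service 4 passes C1's bounds while service 8 fails on response time and throughput; the broker would then pick randomly among all passing services.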
Fig. 4. Service Selection (service selected vs. discovery request sequence for consumers C1–C4)
Validation of the other QoS attributes such as price
and reputation is also considered for the service selection
process.
1.7 Scenario 2 (Discovery Process)
Table 4 shows the QoS values measured for seven Web services (Al-Masri and Mahmoud 2007). The QoS parameter price is represented in dollars and was provided by the service provider, while the response time, service availability and throughput values were measured over a period of time.
In order to find the most suitable Web service, it is important to optimize the value of each QoS parameter. For instance, a Web service with a lower response time is preferable to one with a higher response time.
Table 4
QOS METRICS FOR VARIOUS AVAILABLE EMAIL VERIFICATION WEB SERVICES

ID | Service Provider Name                | RT (ms) | TP (req/min) | AV (%) | Price (dollars/transaction)
S1 | XML Logic Validate Email             | 720     | 6.00         | 85     | 0.012
S2 | XWebservices XWebEmailValidation     | 1100    | 1.74         | 81     | 0.01
S3 | StrikeIron Verification              | 710     | 12           | 98     | 0.01
S4 | StrikeIron Email Address Validator   | 912     | 10           | 96     | 0.07
S5 | CDYNE Email Verifier                 | 910     | 11           | 90     | 0.02
S6 | Webservicex ValidateEmail            | 1232    | 4            | 87     | 0
S7 | ServiceObjects DOTS Email Validation | 391     | 9            | 99     | 0.05
Table 5 summarizes the QoS requirements of three service consumers: RT in milliseconds, AV as a percentage, TP in requests per minute and price in dollars per transaction. All consumers specify that the maximum number of services to be returned is 1. C1 mainly focuses on throughput and availability, whereas C2 and C3 constrain all four QoS attributes.
Table 5
CONSUMERS' QOS REQUIREMENTS

Consumer | RT (ms) | TP (req/min) | AV (%) | Price (dollars)
C1       | -       | 10           | 89     | -
C2       | 750     | 5            | 90     | 0.06
C3       | 1100    | 7            | 99     | 0.01
For each consumer, the same service discovery request was run 15 times, and the service selected in each run is shown in Fig. 5. For C1, a service is selected based on throughput and availability. It can be observed from the figure that services 3, 4 and 5 are chosen randomly as the best services satisfying the requirements given in Table 5. For C2, the randomly selected services are 3 and 5. Service 3 is chosen for consumer C3, as it satisfies the requirements given in Table 5.
Fig. 5. Service Selection
1.8 Scenario 3 (Verification Process for Registration)
This section presents the scenario 3 of the
experimental results. This simulation model comprises of
a single broker, a Web service and N concurrent clients.
This section mainly focuses on the service registration.
The QoS parameters taken into consideration are
response time, throughput and service availability. The
following test cases are carried out to verify the QoS
properties. Consider a service provider which would
provide service for the registration process. The proposed
WSB approach evaluates the given the services and
issues certificate or send appropriate feedback to the
provider based on the verification results.
1.8.1 Response Time
Table 6 shows the response time observation by the
proposed WSB architecture. The values are tabulated for
varying the number of clients. From the Table, it is
observed that the response time varies with the different
number of clients. With increase in the number of clients,
the response time also increases.
Table 6
EVALUATION OF RESPONSE TIME FOR ONLINE TICKET BOOKING SERVICE

Number of Clients | Response Time (ms)
20                | 400
40                | 512
100               | 705
140               | 1050
180               | 1498
220               | 1654
260               | 2145
Fig. 6. Distribution of RT with Increased Number of Clients
Fig. 6 shows the response time of the online ticket booking service graphically. It can be observed from Fig. 6 that the response time increases roughly linearly with the number of clients; for instance, with 260 clients the response time is 2145 milliseconds.
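As an illustration of the roughly linear growth, a least-squares line can be fitted to the Table 6 data. This post-hoc fit is our own sketch and not part of the WSB prototype.

```python
# Least-squares fit of response time vs. number of clients (data from Table 6)
clients = [20, 40, 100, 140, 180, 220, 260]
rt_ms   = [400, 512, 705, 1050, 1498, 1654, 2145]

n = len(clients)
mean_x = sum(clients) / n
mean_y = sum(rt_ms) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(clients, rt_ms))
         / sum((x - mean_x) ** 2 for x in clients))
intercept = mean_y - slope * mean_x
# slope approximates the extra milliseconds of response time per added client
```

For this data the fitted slope is roughly 7 ms of additional response time per client, which quantifies the near-linear trend visible in Fig. 6.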
1.8.2 Service Availability
Table 7 shows the service availability evaluation of the proposed WSB approach. Initially, the service availability is steady at 100% up to 30 clients, after which it fluctuates. When the number of clients reaches 100, the service availability steadies again at 100% until the fluctuation points at 220 and 260 clients, where it decreases.
Table 7
EVALUATION OF SERVICE AVAILABILITY FOR ONLINE TICKET BOOKING SERVICE

Number of Clients | Service Availability (%)
20                | 100
30                | 100
40                | 97
50                | 98
100               | 100
140               | 100
180               | 100
220               | 94
260               | 90
The graphical representation of the service availability
of the proposed broker based approach is shown in Fig 7.
Fig. 7. Distribution of Service Availability with Increased Number of Clients
1.8.3 Throughput
Table 8 shows the throughput values of the online ticket booking service with an increasing number of clients. As the number of clients increases, the throughput decreases.
Table 8
EVALUATION OF THROUGHPUT FOR ONLINE TICKET BOOKING SERVICE

Number of Clients | Throughput (req/min)
20                | 17
30                | 17
40                | 15
50                | 13
100               | 11
140               | 8
180               | 6
220               | 5
260               | 4
Fig. 8 shows the throughput, in requests per minute, of the online ticket booking service graphically.
Fig. 8. Distribution of Throughput with Increased Number of Clients
The results of scenario 3 show that the response time, service availability and throughput fluctuate as the number of clients varies. It can therefore be concluded from this scenario that the number of clients has a significant effect on the response time, availability and throughput of the service.
Finally, the results of the validation test cases show that the server's resource capacity, the number of connected clients and the network load have a significant impact on the response time, throughput and availability of the Web service.
1.9 Scenario 4 (Verification Process for Registration)
The functionally matched services obtained from the UDDI registry are verified by the Web Service Broker based on the QoS parameters response time, service availability, price and throughput.
1.9.1 Response Time
The response time of the online ticket booking service is evaluated in Fig. 9, which plots the response time over ten discovery request sequences. It can be observed from Fig. 9 that the response time lies within the range of 700 to 780 milliseconds for all ten discovery request sequences. The highest response time taken by the given ticket booking service is 773 milliseconds, and the average is 758.8 milliseconds. The average response time of the ticket booking service is compared with the service response time given in the tModel data structure in the UDDI registry. If the response time matches the value in the tModel, the service is selected; otherwise, the service is not selected by the WSB.
Fig. 9. Response Time Verification
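The comparison against the published tModel value can be sketched as a simple tolerance check. The 5% relative tolerance and the published value below are assumptions for illustration; the paper does not state how exact the match between measured and published QoS must be.

```python
def verify_attribute(measured_avg, published, tolerance=0.05):
    """Accept the service if the measured average lies within an assumed
    relative tolerance of the value published in the UDDI tModel."""
    return abs(measured_avg - published) <= tolerance * published

# Measured average response time from Fig. 9 vs. a hypothetical tModel value
measured_rt = 758.8            # ms, average over ten discovery requests
published_rt = 750.0           # ms, assumed value from the tModel
service_selected = verify_attribute(measured_rt, published_rt)
```

The same check applies unchanged to the throughput and service availability verifications described below, with only the attribute values swapped.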
1.9.2 Throughput
The throughput results of the online ticket booking service are shown in Fig. 10. The throughput is obtained for the ten discovery request sequences; the highest throughput obtained for the ticket booking service is 12 requests per minute.
The throughput values obtained by the WSB for the ticket booking service are compared with the throughput values given in the tModel data structure in the UDDI registry. If the throughput matches the value in the tModel, the service is selected; otherwise, the service is not selected by the WSB.
Fig. 10. Throughput Verification
1.9.3 Service Availability
Similarly, Fig. 11 shows the service availability verification process. The service availability is obtained for the ten discovery request sequences. The highest service availability obtained for the ticket booking service is 100%, and this value is reached for most of the discovery request sequences; for example, for sequences 1, 2, 3, 5, 6, 8 and 10 the service availability obtained is 100%.
The service availability values obtained by the WSB for the ticket booking service are compared with the values given in the tModel data structure in the UDDI registry. If the service availability matches the value in the tModel, the service is selected; otherwise, the service is not selected by the WSB.
Fig. 11. Service Availability Verification
After the verification process, the services whose QoS matches the requirements given in the interface are selected by the WSB, and the best of these services are given to the consumer.
The proposed WSB exploits the results of the above experiments to report the response time, service availability and throughput of a service to its provider. Validation of these QoS attributes is performed by the WSB for the online ticket booking service, and the verifier then analyses the results obtained. The verifier also checks the QoS information given in the service interface, and stores all important QoS information published in the interface in its database.
Once the verification process completes successfully, the certification process is initiated. The certifier issues a certificate to the service provider through the service publisher, indicating that the offered QoS conforms to its description. If the evaluated results of the online ticket booking service do not meet the QoS description specified in the service interface, the certifier instead provides appropriate feedback to the service provider through the service publisher.
1.10 Summary
This section has explored a broker-based architecture for Web service registration and discovery with QoS. The goal of the broker is to support Web service discovery with QoS registration, verification, certification and confirmation; the broker performs both the publishing and the selection of Web services. The key features of the broker that are not supported by existing QoS-aware approaches to Web services are the focus of this approach.
A service provider does not have to design and develop her/his own broker, but can simply invoke one of the brokers already published on the Internet. The client likewise finds good support for Web service discovery through the broker services, enabling a more flexible and trustworthy architecture. The experiments covered four scenarios involving QoS attributes such as response time, service availability and throughput, and their results confirmed the importance of QoS in distinguishing functionally similar Web services.
V. CONCLUSION
In this paper, the WSB performs the registration and selection of Web services. The key features of the broker that are not supported by existing QoS-aware approaches to Web services have been described. A service provider does not have to design and develop her/his own broker, but can simply invoke one of the published brokers, and the client finds good support for Web service discovery through the broker services. The goal of the broker is to support Web service discovery with QoS registration, verification, certification and confirmation. This approach helps identify the most suitable Web service according to the consumer's requirements. Security techniques can be added to the proposed approach for a more flexible and trustworthy architecture. Further future work would be to enhance the adaptability of the present Web service architecture to support mobile Web services, and to extend the proposed architecture to handle other QoS attributes.
REFERENCES
[1] S. Ran, "A Model for Web Services Discovery with QoS", ACM SIGecom Exchanges, Vol. 4(1), pp. 1–10, 2003.
[2] W3C, "QoS for Web Services: Requirements and Possible Approaches", 2003. Available: http://www.w3c.or.kr/kr-office/TR/2003/NOTE-ws-qos-20031125/
[3] IBM Corporation, "Web Service Level Agreement (WSLA) Language Specification", Ver. 1.0, 2003. Available: http://www.research.ibm.com/wsla/WSLASpecV1-20030128.pdf
[4] W3C, "WS-Policy Framework", Ver. 1.2, 2006. Available: http://www.w3.org/Submission/WS-Policy/
[5] DAML Services, "DAML-S / OWL-S", 2006. Available: http://www.daml.org/services/owl-s/
[6] E.M. Maximilien and M.P. Singh, "Reputation and Endorsement for Web Services", ACM SIGecom Exchanges, Vol. 3(1), pp. 24–31, 2002.
[7] E.M. Maximilien and M.P. Singh, "Self-Adjusting Trust and Selection for Web Services", In Extended Proc. of the 2nd IEEE Intl. Conf. on Autonomic Computing (ICAC), pp. 385–386, 2005.
[8] L. Vu, M. Hauswirth and K. Aberer, "QoS-Based Service Selection and Ranking with Trust and Reputation Management", In Proc. of the Intl. Conf. on Cooperative Information Systems (CoopIS), Agia Napa, Cyprus, 2005.
[9] E.M. Maximilien and M.P. Singh, "Toward Autonomic Web Services Trust and Selection", ICSOC '04, pp. 212–221, November 2004.
[10] A. Blum, "UDDI as an Extended Web Services Registry: Versioning, Quality of Service, and More", White paper, SOA World Magazine, Vol. 4(6), 2004.
[11] Shrabani Mallick and D.S. Kushwaha, "An Efficient Web Service Discovery Architecture", International Journal of Computer Applications, Vol. 3(12), July 2010.
[12] S. Majitha, A. Shaikhali, O. Rana, and D. Walker, "Reputation-Based Semantic Service Discovery", In Proc. of the 13th IEEE Intl. Workshops on Enabling Technologies: Infrastructures for Collaborative Enterprises (WETICE), Modena, Italy, pp. 297–302, 2004.
[13] T. Rajendran and P. Balasubramanie, "An Efficient Framework for Agent-Based Quality Driven Web Services Discovery", IEEE International Conference on Intelligent Agent and Multi-Agent Systems (IAMA 2009), Chennai, 2009.
[14] A. Keller and H. Ludwig, "The WSLA Framework: Specifying and Monitoring Service Level Agreements for Web Services", IBM Research Report, 2002.
[15] P. Farkas and H. Charaf, "Web Services Planning Concepts", 1st International Workshop on C# and .NET Technologies on Algorithms, Computer Graphics, Visualization, Distributed and WEB Computing, Feb. 2003.
[16] A. ShaikAli, O.F. Rana, R. Al-Ali, and D.W. Walker, "UDDIe: An Extended Registry for Web Services", In Proc. of the Symposium on Applications and the Internet Workshops, IEEE CS, pp. 85–89, 2003.
[17] Z. Chen, C. Liang-Tien, B. Silverajan, and L. Bu-Sung, "UX: An Architecture Providing QoS-Aware and Federated Support for UDDI", In Proc. of the Intl. Conf. on Web Services, CSREA Press, pp. 171–176, 2003.
[18] S. Kalepu, S. Krishnaswamy, and S.W. Loke, "Reputation = f (User Ranking, Compliance, Verity)", Proceedings of ICWS '04, 2004.
[19] Y. Liu, A. Ngu, and L. Zheng, "QoS Computation and Policing in Dynamic Web Service Selection", Proceedings of the WWW 2004 Conf., 2004.
[20] Diego Zuquim Guimaraes Garcia and Maria Beatriz Felgar de Toledo, "A Web Service Architecture Providing QoS Management", Fourth Latin American Web Congress (LA-Web '06), pp. 189–198, 2006.
[21] M. Tian, A. Gramm, H. Ritter, and J. Schiller, "Efficient Selection and Monitoring of QoS-Aware Web Services with the WS-QoS Framework", Proceedings of the IEEE/WIC/ACM International Conference on Web Intelligence (WI '04), 2004.
[22] Eyhab Al-Masri and Qusay H. Mahmoud, "QoS-Based Discovery and Ranking of Web Services", Proceedings of the IEEE International Conference, 2007.
[23] Ziqiang Xu, Patrick Martin, Wendy Powley, and Farhana Zulkernine, "Reputation Enhanced QoS-Based Web Services Discovery", IEEE International Conference on Web Services (ICWS 2007), 2007.
[24] Demian Antony D'Mello, V.S. Ananthanarayana, and T. Santhi, "A QoS Broker Based Architecture for Web Service Selection", Proceedings of the IEEE International Conference, 2008.
[25] T. Rajendran and P. Balasubramanie, "An Agent-Based Dynamic Web Service Discovery Framework with QoS Support", International J. of Engg. Research & Indu. Appls. (IJERIA), Vol. 2(5), pp. 1–13, 2009.
[26] Demian A. D'Mello and V.S. Ananthanarayana, "A QoS Model and Selection Mechanism for QoS-Aware Web Services", Proceedings of the International Conference on Data Management (ICDM 2008), 2008.
[27] M.A. Serhani, R. Dssouli, A. Hafid, and H. Sahraoui, "A QoS Broker Based Architecture for Efficient Web Services Selection", In Proc. of the IEEE Intl. Conf. on Web Services, IEEE CS, pp. 113–120, 2005.
[28] Hongan Chen, Tao Yu, and Kwei-Jay Lin, "QCWS: An Implementation of QoS-Capable Multimedia Web Services", IEEE Fifth International Symposium on Multimedia Software Engineering, 2003.
[29] R. Wishart, R. Robinson, J. Indulska, and A. Josang, "SuperstringRep: Reputation-Enhanced Service Discovery", In Proc. of the 28th Australasian Conf. on Computer Science, Vol. 38, pp. 49–57, 2005.
[30] J. Hu, C. Guo, H. Wang, and P. Zou, "Quality Driven Web Services Selection", Proceedings of the IEEE International Conference on e-Business Engineering (ICEBE '05), 2005.
[31] M. Crasso, J.M. Rodriguez, A. Zunino, and M. Campo, "Improving Web Service Descriptions for Effective Service Discovery", Science of Computer Programming, 75, pp. 1001–1021, 2010.
[32] Vuong Xuan Tran, Hidekazu Tsuji, and Ryosuke Masuda, "A New QoS Ontology and Its QoS-Based Ranking Algorithm for Web Services", Simulation Modelling Practice and Theory, Elsevier, 2009.
[33] Mydhili K. Nair and V. Gopalakrishna, "Look Before You Leap: A Survey of Web Service Discovery", International Journal of Computer Applications, Vol. 7(5), pp. 22–30, September 2010.
[34] Jaleh Shoshtarian Malak, M. Mohsenzadeh, and M.A. Seyyedi, "Multi Agent Based Web Service QoS Management Architecture", 14th International CSI Computer Conference (CSICC), pp. 280–286, 2009.