A netting clustering analysis method under intuitionistic fuzzy environment

Zhong Wang, Zeshui Xu*, Shousheng Liu, Jian Tang
Institute of Sciences, PLA University of Science and Technology, Nanjing 211101, China

Applied Soft Computing 11 (2011) 5558–5564. Contents lists available at ScienceDirect. Journal homepage: www.elsevier.com/locate/asoc
Article history: Received 14 January 2010; received in revised form 2 September 2010; accepted 1 May 2011; available online 6 May 2011.
Keywords: Intuitionistic fuzzy set; intuitionistic fuzzy relation; intuitionistic fuzzy similarity matrix; netting method; clustering analysis
1568-4946/$ – see front matter © 2011 Published by Elsevier B.V. doi:10.1016/j.asoc.2011.05.004

Abstract

In this paper, we investigate the technique for clustering objects with intuitionistic fuzzy information. We first propose a formula to derive the intuitionistic fuzzy similarity degree between two intuitionistic fuzzy sets and develop an approach to constructing an intuitionistic fuzzy similarity matrix. Then, we present a netting method for the clustering analysis of intuitionistic fuzzy sets via the intuitionistic fuzzy similarity matrix. Two numerical examples are given to illustrate and verify our method.

1. Introduction

Atanassov [1] introduced the concept of the intuitionistic fuzzy set (IFS), which is a generalization of the fuzzy set originally introduced by Zadeh [2]. Since its appearance, the IFS has been investigated by many researchers and applied to many fields, such as decision making [3–6], pattern recognition [7–10], market prediction [11] and medical diagnosis [12,13]. Clustering analysis is a well-established technique for sorting observations into clusters such that each cluster is as homogeneous as possible [14]. How to cluster intuitionistic fuzzy information is an important research topic, which has recently been receiving great attention from researchers [15–17]. Zhang et al. [15] proposed a clustering technique for IFSs on the basis of the λ-cutting matrix of an interval-valued matrix. Xu and Yager [18,19] gave a clustering technique that transforms an association matrix into an equivalent association matrix, from which a k-cutting matrix is derived and used to cluster the given IFSs. Cai et al. [17] presented a clustering method based on the intuitionistic fuzzy equivalent dissimilarity matrix and (α, β)-cutting matrices. However, all these intuitionistic fuzzy clustering methods are based on intuitionistic fuzzy equivalence matrices and the transitive closure technique, which requires a large amount of computational effort and takes a long time to accomplish. In order to overcome this issue and make the clustering process more effective, in this paper we shall develop a netting method for clustering objects with intuitionistic fuzzy information.

The remainder of the paper is organized as follows. In Section 2, we give a brief review of some basic knowledge related to IFSs. Section 3 presents a formula to derive the intuitionistic fuzzy similarity degree between IFSs, and then develops an approach to constructing an intuitionistic fuzzy similarity matrix. Based on the netting technique, in Section 4 we propose a method for clustering objects which are represented by IFSs. Section 5 illustrates and verifies our method with two numerical examples, and Section 6 concludes the paper with a summary and some remarks. Briefly, Fig. 1 provides a system diagram showing the main idea of this paper.

(Footnote: This work was supported in part by the National Science Fund for Distinguished Young Scholars of China (No. 70625005) and the National Natural Science Foundation of China (No. 71071161). *Corresponding author. E-mail address: xu [email protected] (Z. Xu).)

2. Preliminaries

Let a set X be fixed. An intuitionistic fuzzy set (IFS) A on X is defined by Atanassov [1] as

A = {< x, μA(x), vA(x) > | x ∈ X},

where μA : X → [0, 1] and vA : X → [0, 1] denote a membership function and a non-membership function, respectively. Furthermore, μA(x) + vA(x) ≤ 1 for any x ∈ X, and πA(x) = 1 − μA(x) − vA(x) is called the uncertainty (or hesitation) function of x to A. Especially, if πA(x) = 0 for all x ∈ X, then A reduces to a fuzzy set. For convenience, Xu [18] called α = (μα, vα) an intuitionistic fuzzy number (IFN), where μα ∈ [0, 1], vα ∈ [0, 1], μα + vα ≤ 1, and πα = 1 − μα − vα.

Based on IFSs, we introduce some basic concepts below:

Definition 1 ([15]). Let Z = (zij)m×n be an m × n matrix. If all zij (i = 1, 2, ..., m; j = 1, 2, ..., n) are IFNs, then Z is called an intuitionistic fuzzy matrix.
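The IFN constraints above (μ, v ∈ [0, 1], μ + v ≤ 1, hesitation π = 1 − μ − v) can be captured in a few lines. A minimal sketch in Python; the class name and helpers are ours, not the paper's:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFN:
    """An intuitionistic fuzzy number alpha = (mu, v)."""
    mu: float  # membership degree
    v: float   # non-membership degree

    def __post_init__(self):
        # enforce mu, v in [0, 1] and mu + v <= 1 (small tolerance for floats)
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.v <= 1.0
                and self.mu + self.v <= 1.0 + 1e-12):
            raise ValueError("invalid IFN: need mu, v in [0, 1] and mu + v <= 1")

    @property
    def pi(self) -> float:
        # uncertainty (hesitation) degree
        return 1.0 - self.mu - self.v

a = IFN(0.3, 0.5)
print(round(a.pi, 6))  # 0.2
```

When π = 0 for every element, the IFS degenerates to an ordinary fuzzy set, which is exactly the reduction the paper relies on later when comparing with the fuzzy netting algorithm.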


Fig. 1. The main thought of this paper.

Definition 2 ([20]). Let X and Y be two non-empty sets; then

R = {< (x, y), μR(x, y), vR(x, y) > | x ∈ X, y ∈ Y}

is called an intuitionistic fuzzy relation, where μR : X × Y → [0, 1], vR : X × Y → [0, 1], and 0 ≤ μR(x, y) + vR(x, y) ≤ 1, for any (x, y) ∈ X × Y.

Definition 3 ([20]). Let R be an intuitionistic fuzzy relation. If

(1) (Reflexivity): μR(x, x) = 1, vR(x, x) = 0, for any x ∈ X;
(2) (Symmetry): μR(x, y) = μR(y, x), vR(x, y) = vR(y, x), for any (x, y) ∈ X × Y,

then R is called an intuitionistic fuzzy similarity relation.

Definition 4 ([3]). Let Z = (zij)n×n be an n × n intuitionistic fuzzy matrix, where zij = (μij, vij), i, j = 1, 2, ..., n. If

(1) (Reflexivity): zii = (1, 0), for all i = 1, 2, ..., n;
(2) (Symmetry): zij = zji, i.e., μij = μji, vij = vji, for all i, j = 1, 2, ..., n,

then Z is called an intuitionistic fuzzy similarity matrix.
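The two conditions of Definition 4 are straightforward to verify mechanically. A small sketch (the helper name and test data are ours, not the paper's):

```python
# Check Definition 4 on a matrix of (mu, v) pairs:
# (1) reflexivity: every diagonal entry is (1, 0);
# (2) symmetry: z_ij == z_ji for all i, j.
def is_if_similarity_matrix(Z):
    n = len(Z)
    reflexive = all(Z[i][i] == (1, 0) for i in range(n))
    symmetric = all(Z[i][j] == Z[j][i] for i in range(n) for j in range(n))
    return reflexive and symmetric

Z = [[(1, 0), (0.8, 0.1)],
     [(0.8, 0.1), (1, 0)]]
print(is_if_similarity_matrix(Z))  # True
```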

3. A new approach to constructing the intuitionistic fuzzy similarity matrix

The aim of this paper is to construct the intuitionistic fuzzy similarity matrix and then utilize it to derive a method for clustering analysis. To achieve this, we generally need to consider a multiple attribute decision making (MADM) problem and thereby obtain an intuitionistic fuzzy matrix. After that, we shall seek a method to construct an intuitionistic fuzzy similarity matrix so as to carry out the clustering analysis.

Now we consider a MADM problem. Let Y = {Y1, Y2, ..., Ym} be a set of alternatives, and G = {G1, G2, ..., Gn} be a set of attributes. The characteristic of each alternative Yi under all the attributes Gj (j = 1,


2, ..., n) is represented as an IFS:

Yi = {< Gj, μYi(Gj), vYi(Gj) > | Gj ∈ G}, i = 1, 2, ..., m; j = 1, 2, ..., n    (1)

where μYi(Gj) denotes the membership degree of Yi to Gj, and vYi(Gj) denotes the non-membership degree of Yi to Gj. Obviously, πYi(Gj) = 1 − μYi(Gj) − vYi(Gj) is the uncertainty (or hesitation) degree of Yi to Gj. If we let zij = (μij, vij) = (μYi(Gj), vYi(Gj)), which is an IFN, then based on these IFNs zij (i = 1, 2, ..., m; j = 1, 2, ..., n), we can construct an m × n intuitionistic fuzzy matrix Z = (zij)m×n.

Next, we shall develop an approach to constructing an intuitionistic fuzzy similarity matrix based on the intuitionistic fuzzy matrix Z = (zij)m×n.

For any two alternatives Yi and Yk, we first use the normalized Hamming distance to get the average value of the absolute deviations of the non-membership degrees vij and vkj, for all j = 1, 2, ..., n:

d1(Yi, Yk) = (1/n) Σ_{j=1}^{n} |vij − vkj|, i, k = 1, 2, ..., m    (2)

Analogously, we get the average value of the absolute deviations of the membership degrees μij and μkj, for all j = 1, 2, ..., n:

d2(Yi, Yk) = (1/n) Σ_{j=1}^{n} |μij − μkj|, i, k = 1, 2, ..., m    (3)

Obviously, the distances (2) and (3) show the closeness degrees of the characteristics of each pair of alternatives Yi and Yk. The smaller the values of d1(Yi, Yk) and d2(Yi, Yk) are, the more similar the two alternatives Yi and Yk are.

In an intuitionistic fuzzy similarity matrix, each of its elements is an IFN. To get an intuitionistic fuzzy closeness degree of Yi and Yk, we may consider the value of d1(Yi, Yk) as a non-membership degree v̇ik, and then it may be hoped to define

μ̇ik = 1 − (1/n) Σ_{j=1}^{n} |μij − μkj|, i, k = 1, 2, ..., m    (4)

as a membership degree. Now we need to check whether 0 ≤ μ̇ik + v̇ik ≤ 1 holds or not. We have

μ̇ik + v̇ik = 1 − (1/n) Σ_{j=1}^{n} |μij − μkj| + (1/n) Σ_{j=1}^{n} |vij − vkj| ≥ 0,

and moreover

μ̇ik + v̇ik = 1 − (1/n) Σ_{j=1}^{n} |μij − μkj| + (1/n) Σ_{j=1}^{n} |vij − vkj|
  = 1 − (1/n) Σ_{j=1}^{n} |(1 − μij) − (1 − μkj)| + (1/n) Σ_{j=1}^{n} |vij − vkj|
  = 1 − (1/n) Σ_{j=1}^{n} |(vij + πij) − (vkj + πkj)| + (1/n) Σ_{j=1}^{n} |vij − vkj|
  = 1 − (1/n) Σ_{j=1}^{n} |(vij − vkj) + (πij − πkj)| + (1/n) Σ_{j=1}^{n} |vij − vkj|
  ≥ 1 − (1/n) Σ_{j=1}^{n} |vij − vkj| − (1/n) Σ_{j=1}^{n} |πij − πkj| + (1/n) Σ_{j=1}^{n} |vij − vkj|
  = 1 − (1/n) Σ_{j=1}^{n} |πij − πkj|, i, k = 1, 2, ..., m,

where πij = 1 − μij − vij. Thus, 0 ≤ μ̇ik + v̇ik ≤ 1 cannot be guaranteed.


From the analysis above, we can see that in an IFN the membership degree is closely related to both the non-membership degree and the uncertainty degree. Motivated by this idea, we may modify (4) as:

μ̇ik = 1 − (1/n) Σ_{j=1}^{n} |vij − vkj| − (1/n) Σ_{j=1}^{n} |πij − πkj|, i, k = 1, 2, ..., m    (5)

with μ̇ik = 1 if and only if vij = vkj and πij = πkj, for all j = 1, 2, ..., n.

Based on (2) and (5), we have the following concept:

Definition 5. Let Yi and Yk be two IFSs on X, and let R(Yi, Yk) be a binary relation on X × X. If

R(Yi, Yk) = (1, 0), when Yi = Yk;
R(Yi, Yk) = ( 1 − (1/n) Σ_{j=1}^{n} |vij − vkj| − (1/n) Σ_{j=1}^{n} |πij − πkj| , (1/n) Σ_{j=1}^{n} |vij − vkj| ), when Yi ≠ Yk,    (6)

then R(Yi, Yk) is called the closeness degree of Yi and Yk.
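Eq. (6) can be sketched directly: the non-membership component is d1 of Eq. (2), and the membership component subtracts both the non-membership and the hesitation deviations. An illustrative implementation (function and variable names are ours; the data rows are Y1 and Y2 of Table 1 in Section 5):

```python
def closeness(Yi, Yk):
    """Closeness degree R(Yi, Yk) of Eq. (6); Yi, Yk are lists of (mu, v) pairs."""
    if Yi == Yk:
        return (1.0, 0.0)
    n = len(Yi)
    # average deviation of non-membership degrees (this is d1, Eq. (2))
    dv = sum(abs(vi - vk) for (_, vi), (_, vk) in zip(Yi, Yk)) / n
    # average deviation of hesitation degrees pi = 1 - mu - v
    dpi = sum(abs((1 - mi - vi) - (1 - mk - vk))
              for (mi, vi), (mk, vk) in zip(Yi, Yk)) / n
    return (1.0 - dv - dpi, dv)  # (membership, non-membership) of the IFN

Y1 = [(0.3, 0.5), (0.6, 0.1), (0.4, 0.3), (0.8, 0.1), (0.1, 0.6), (0.5, 0.4)]
Y2 = [(0.6, 0.3), (0.5, 0.2), (0.6, 0.1), (0.7, 0.1), (0.3, 0.6), (0.4, 0.3)]
mu, v = closeness(Y1, Y2)
print(round(mu, 6), round(v, 6))  # 0.8 0.1, matching z12 in Example 1
```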

By (6), we have

Theorem 1. The closeness degree R(Yi, Yk) of Yi and Yk is an intuitionistic fuzzy similarity relation.

Proof.

(1) Let us first prove that R(Yi, Yk) is an IFN:

(a) If Yi = Yk, then R(Yi, Yk) = (1, 0);
(b) If Yi ≠ Yk, then

μ̇ik = 1 − (1/n) Σ_{j=1}^{n} |vij − vkj| − (1/n) Σ_{j=1}^{n} |πij − πkj|
    ≤ 1 − (1/n) Σ_{j=1}^{n} |(vij − vkj) + (πij − πkj)|
    = 1 − (1/n) Σ_{j=1}^{n} |μij − μkj|

Obviously, we have 0 ≤ μ̇ik ≤ 1, with μ̇ik = 1 if and only if vij = vkj and πij = πkj, for all j = 1, 2, ..., n, and with μ̇ik = 0 if and only if μij = 1 and μkj = 0, for all j = 1, 2, ..., n, or μij = 0 and μkj = 1, for all j = 1, 2, ..., n.

Similarly, we have 0 ≤ v̇ik = (1/n) Σ_{j=1}^{n} |vij − vkj| ≤ 1, with v̇ik = 0 if and only if vij = vkj, for all j = 1, 2, ..., n, and with v̇ik = 1 if and only if vij = 1 and vkj = 0, for all j = 1, 2, ..., n, or vij = 0 and vkj = 1, for all j = 1, 2, ..., n.

Also, since

μ̇ik + v̇ik = 1 − (1/n) Σ_{j=1}^{n} |vij − vkj| − (1/n) Σ_{j=1}^{n} |πij − πkj| + (1/n) Σ_{j=1}^{n} |vij − vkj|
          = 1 − (1/n) Σ_{j=1}^{n} |πij − πkj| ≤ 1,

then we have 0 ≤ μ̇ik + v̇ik ≤ 1, with μ̇ik + v̇ik = 1 if and only if πij = πkj, for all j = 1, 2, ..., n, and μ̇ik + v̇ik = 0 if and only if πij = 1 and πkj = 0, for all j = 1, 2, ..., n, or πij = 0 and πkj = 1, for all j = 1, 2, ..., n.

(2) Since R(Yi, Yi) = (1, 0), R is reflexive;
(3) Since |vij − vkj| = |vkj − vij| and |πij − πkj| = |πkj − πij|, we have R(Yi, Yk) = R(Yk, Yi), i.e., R is symmetric. Thus, R(Yi, Yk) is an intuitionistic fuzzy similarity relation.

From (6), we can see that if all the differences of the non-membership degrees and the differences of the uncertainty degrees of two alternatives Yi and Yk with respect to the attributes Gj (j = 1, 2, ..., n) get smaller, then the two alternatives are more similar to each other. □

In the following section, we shall use (6) to develop a new clus-tering method.

4. A netting method for clustering the objects with intuitionistic fuzzy information

The so-called netting means a simple process: first, for an intuitionistic fuzzy similarity matrix Z, we choose a confidence level λ ∈ [0, 1] and obtain the λ-cutting matrix Zλ, replacing the elements on the diagonal of Zλ with the symbols of the alternatives. Below the diagonal, we replace '1' with the nodal point '*' and ignore all the '0' entries in Zλ. From each node '*', we draw a vertical line and a horizontal line to the diagonal, and the corresponding alternatives on the diagonal are clustered into one class [21].

The netting method was first used to cluster data in the field of fuzzy mathematics [21]. With this method, we can obtain the clustering results by 'netting' the elements of the similarity matrix directly. In the following, we propose a netting method for clustering objects with intuitionistic fuzzy information:

Step 1. For a MADM problem, let Y = {Y1, Y2, ..., Ym} and G = {G1, G2, ..., Gn} be defined as in Section 3, and assume that the characteristics of the alternatives Yi (i = 1, 2, ..., m) with respect to the attributes Gj (j = 1, 2, ..., n) are represented as in (1).

Step 2. Construct the intuitionistic fuzzy similarity matrix Z = (zij)m×m by using (6), where zij is an IFN and zij = (μij, vij) = R(Yi, Yj), i, j = 1, 2, ..., m.

Step 3. Delete all the elements above the diagonal and replace the elements on the diagonal with the symbols of the alternatives.

Step 4. Choose the confidence level λ and construct the corresponding λ-cutting matrix. Replace '1' with '*' and delete all the '0' entries in the matrix, then draw the vertical and horizontal lines from each '*' to the symbols of the alternatives on the diagonal. Corresponding to each '*', we have a class that contains two elements. Unite the classes that have common elements, and we obtain the classes corresponding to the selected λ. Update the value of λ until all the alternatives are clustered into one class.

The principle for choosing λ: based on the idea of constructing the similarity matrix in this paper, we weigh the similarity degree of two alternatives mainly through the membership degree of the corresponding IFN. We choose the confidence level λ from the biggest value down to the smallest. When we encounter two equal membership degrees, we first choose the one with the smaller non-membership degree; if both degrees are equal, the corresponding alternatives are clustered into the same class simultaneously. After that, in terms of the chosen λ, we construct the corresponding λ-cutting matrix. With this principle, the clustering results will be more detailed.
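The four steps above reduce to a connected-components computation: each '*' node below the diagonal links its row and column alternatives. A minimal sketch (the union-find helper and names are ours; the sample matrix is the similarity matrix derived in Example 1 below):

```python
def netting_clusters(Z, lam):
    """Z: symmetric matrix of (mu, v) pairs; lam: confidence level in [0, 1]."""
    m = len(Z)
    parent = list(range(m))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(m):
        for k in range(i):            # entries below the diagonal
            if Z[i][k][0] >= lam:     # a '*' node in the lambda-cutting matrix
                parent[find(i)] = find(k)

    classes = {}
    for i in range(m):
        classes.setdefault(find(i), []).append(i + 1)  # 1-based labels Y1, Y2, ...
    return sorted(classes.values())

Z = [[(1, 0), (0.8, 0.1), (0.72, 0.12), (0.75, 0.13), (0.65, 0.22)],
     [(0.8, 0.1), (1, 0), (0.82, 0.08), (0.72, 0.1), (0.68, 0.18)],
     [(0.72, 0.12), (0.82, 0.08), (1, 0), (0.7, 0.05), (0.63, 0.23)],
     [(0.75, 0.13), (0.72, 0.1), (0.7, 0.05), (1, 0), (0.63, 0.25)],
     [(0.65, 0.22), (0.68, 0.18), (0.63, 0.23), (0.63, 0.25), (1, 0)]]
print(netting_clusters(Z, 0.79))  # [[1, 2, 3], [4], [5]]
```

Sweeping lam downward through the distinct membership values reproduces the nested sequence of clusterings of Step 4.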

5. Illustrative examples

Example 1. An auto market wants to classify five different cars Yi (i = 1, 2, 3, 4, 5) into several kinds [15]. Each car has six evaluation factors: (1) G1 – oil consumption; (2) G2 – coefficient of friction; (3) G3 – price; (4) G4 – comfortable degree; (5) G5 – design; (6) G6 – safety coefficient. The evaluation results of each car with respect to

the factors Gj (j = 1, 2, ..., 6) are represented by the IFSs shown in Table 1.

Table 1
The characteristics information of the cars.

      G1          G2          G3          G4          G5          G6
Y1    (0.3, 0.5)  (0.6, 0.1)  (0.4, 0.3)  (0.8, 0.1)  (0.1, 0.6)  (0.5, 0.4)
Y2    (0.6, 0.3)  (0.5, 0.2)  (0.6, 0.1)  (0.7, 0.1)  (0.3, 0.6)  (0.4, 0.3)
Y3    (0.4, 0.4)  (0.8, 0.1)  (0.5, 0.1)  (0.6, 0.2)  (0.4, 0.5)  (0.3, 0.2)
Y4    (0.2, 0.4)  (0.4, 0.1)  (0.9, 0.0)  (0.8, 0.1)  (0.2, 0.5)  (0.7, 0.1)
Y5    (0.5, 0.2)  (0.3, 0.6)  (0.6, 0.3)  (0.7, 0.1)  (0.6, 0.2)  (0.5, 0.3)

In the following, we utilize the intuitionistic fuzzy netting method to classify the five cars, which involves the following steps.

Step 1. By (6), we calculate

μ̇12 = 1 − (1/6) Σ_{j=1}^{6} |v1j − v2j| − (1/6) Σ_{j=1}^{6} |π1j − π2j|
    = 1 − (1/6)(0.2 + 0.1 + 0.2 + 0.0 + 0.0 + 0.1) − (1/6)(0.1 + 0.0 + 0.0 + 0.1 + 0.2 + 0.2)
    = 0.8

v̇12 = (1/6)(0.2 + 0.1 + 0.2 + 0.0 + 0.0 + 0.1) = 0.1

and then calculate the others in a similar way. Consequently, we get the intuitionistic fuzzy similarity matrix:

Z =
[ (1, 0)        (0.8, 0.1)    (0.72, 0.12)  (0.75, 0.13)  (0.65, 0.22) ]
[ (0.8, 0.1)    (1, 0)        (0.82, 0.08)  (0.72, 0.1)   (0.68, 0.18) ]
[ (0.72, 0.12)  (0.82, 0.08)  (1, 0)        (0.7, 0.05)   (0.63, 0.23) ]
[ (0.75, 0.13)  (0.72, 0.1)   (0.7, 0.05)   (1, 0)        (0.63, 0.25) ]
[ (0.65, 0.22)  (0.68, 0.18)  (0.63, 0.23)  (0.63, 0.25)  (1, 0)       ]
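The whole matrix above can be rebuilt from Table 1 by applying Eq. (6) pairwise. A short sketch (the `closeness` helper follows Eq. (6); data layout and names are ours):

```python
def closeness(Yi, Yk):
    """Closeness degree of Eq. (6); Yi, Yk are lists of (mu, v) pairs."""
    if Yi == Yk:
        return (1.0, 0.0)
    n = len(Yi)
    dv = sum(abs(vi - vk) for (_, vi), (_, vk) in zip(Yi, Yk)) / n
    dpi = sum(abs((1 - mi - vi) - (1 - mk - vk))
              for (mi, vi), (mk, vk) in zip(Yi, Yk)) / n
    return (1.0 - dv - dpi, dv)

cars = [
    [(0.3, 0.5), (0.6, 0.1), (0.4, 0.3), (0.8, 0.1), (0.1, 0.6), (0.5, 0.4)],  # Y1
    [(0.6, 0.3), (0.5, 0.2), (0.6, 0.1), (0.7, 0.1), (0.3, 0.6), (0.4, 0.3)],  # Y2
    [(0.4, 0.4), (0.8, 0.1), (0.5, 0.1), (0.6, 0.2), (0.4, 0.5), (0.3, 0.2)],  # Y3
    [(0.2, 0.4), (0.4, 0.1), (0.9, 0.0), (0.8, 0.1), (0.2, 0.5), (0.7, 0.1)],  # Y4
    [(0.5, 0.2), (0.3, 0.6), (0.6, 0.3), (0.7, 0.1), (0.6, 0.2), (0.5, 0.3)],  # Y5
]
Z = [[closeness(a, b) for b in cars] for a in cars]
print(tuple(round(x, 2) for x in Z[1][2]))  # (0.82, 0.08), i.e. z23 above
```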

Step 2. Delete all the elements above the diagonal and replace the elements on the diagonal in Z with the symbols of the alternatives Yi (i = 1, 2, 3, 4, 5):

Z' =
[ Y1                                                       ]
[ (0.8, 0.1)    Y2                                         ]
[ (0.72, 0.12)  (0.82, 0.08)  Y3                           ]
[ (0.75, 0.13)  (0.72, 0.1)   (0.7, 0.05)   Y4             ]
[ (0.65, 0.22)  (0.68, 0.18)  (0.63, 0.23)  (0.63, 0.25)  Y5 ]

Step 3. Choose the confidence level λ properly, and get the corresponding clustering results with the intuitionistic fuzzy netting method:

(1) When 0.82 < λ ≤ 1.0, we have

Z'' =
[ Y1                  ]
[     Y2              ]
[         Y3          ]
[             Y4      ]
[                 Y5  ]

and then each car is clustered into its own class: {Y1}, {Y2}, {Y3}, {Y4}, {Y5}.

(2) When 0.8 < λ ≤ 0.82, we have

Z'' =
[ Y1                  ]
[     Y2              ]
[     *   Y3          ]
[             Y4      ]
[                 Y5  ]

and then the cars Yi (i = 1, 2, 3, 4, 5) are clustered into the following four classes: {Y1}, {Y2, Y3}, {Y4}, {Y5}.


(3) When 0.75 < λ ≤ 0.8, we have

Z'' =
[ Y1                  ]
[ *   Y2              ]
[     *   Y3          ]
[             Y4      ]
[                 Y5  ]

and then the cars Yi (i = 1, 2, 3, 4, 5) are clustered into three classes: {Y1, Y2, Y3}, {Y4}, {Y5}.

(4) When 0.72 < λ ≤ 0.75, we have

Z'' =
[ Y1                  ]
[ *   Y2              ]
[     *   Y3          ]
[ *           Y4      ]
[                 Y5  ]

and then the cars Yi (i = 1, 2, 3, 4, 5) are clustered into two classes: {Y1, Y2, Y3, Y4}, {Y5}.

(5) When 0.68 < λ ≤ 0.72, the entries z31 = (0.72, 0.12) and z42 = (0.72, 0.1) have equal membership degrees, so by the principle of Section 4 we have the following two cases:

(i)
Z'' =
[ Y1                  ]
[ *   Y2              ]
[     *   Y3          ]
[ *   *       Y4      ]
[                 Y5  ]

In this case, the cars Yi (i = 1, 2, 3, 4, 5) are clustered into two classes: {Y1, Y2, Y3, Y4}, {Y5};

(ii)
Z'' =
[ Y1                  ]
[ *   Y2              ]
[ *   *   Y3          ]
[ *           Y4      ]
[                 Y5  ]

In this case, the cars Yi (i = 1, 2, 3, 4, 5) are also clustered into two classes: {Y1, Y2, Y3, Y4}, {Y5}.

(6) When 0.65 < λ ≤ 0.68, we have

Z'' =
[ Y1                  ]
[ *   Y2              ]
[ *   *   Y3          ]
[ *   *       Y4      ]
[     *           Y5  ]

and then the cars Yi (i = 1, 2, 3, 4, 5) are clustered into one class: {Y1, Y2, Y3, Y4, Y5}.

In the following, let us make a simple comparison between the method developed in this paper and Zhang et al.'s method [15] in Table 2.

From Table 2, we know that the intuitionistic fuzzy netting method has some desirable advantages over Zhang et al.'s method: (1) it does not need to calculate the equivalent matrix, and thus requires much less computational effort; (2) it can derive more detailed clustering results. Therefore, compared to Zhang et al.'s method, our method has more prospects for practical applications.

Table 2
Comparisons of the derived results.

Classes | Result derived by the intuitionistic fuzzy netting method | Result derived by Zhang et al.'s method [15]
5 | {Y1}, {Y2}, {Y3}, {Y4}, {Y5} | {Y1}, {Y2}, {Y3}, {Y4}, {Y5}
4 | {Y1}, {Y2, Y3}, {Y4}, {Y5} | –
3 | {Y1, Y2, Y3}, {Y4}, {Y5} | {Y1, Y2, Y3}, {Y4}, {Y5}
2 | {Y1, Y2, Y3, Y4}, {Y5} | –
1 | {Y1, Y2, Y3, Y4, Y5} | {Y1, Y2, Y3, Y4, Y5}

Table 3
The characteristics of the cars.

       G1          G2          G3          G4          G5          G6
Y1     (0.8, 0.1)  (0.4, 0.1)  (0.6, 0.1)  (0.7, 0.3)  (0.6, 0.2)  (0.5, 0.0)
Y2     (0.0, 0.3)  (0.1, 0.3)  (0.0, 0.6)  (0.0, 0.5)  (0.5, 0.3)  (0.4, 0.2)
Y3     (0.2, 0.0)  (0.9, 0.1)  (0.0, 0.7)  (0.0, 0.1)  (0.3, 0.2)  (0.8, 0.2)
Y4     (0.0, 0.5)  (0.3, 0.0)  (0.7, 0.1)  (0.6, 0.1)  (0.0, 0.7)  (0.7, 0.2)
Y5     (0.4, 0.6)  (0.2, 0.4)  (0.9, 0.1)  (0.6, 0.1)  (0.7, 0.2)  (0.7, 0.3)
Y6     (0.0, 0.2)  (0.0, 0.0)  (0.5, 0.4)  (0.5, 0.4)  (0.3, 0.6)  (0.0, 0.0)
Y7     (0.8, 0.1)  (0.2, 0.1)  (0.1, 0.0)  (0.7, 0.0)  (0.6, 0.4)  (0.0, 0.6)
Y8     (0.1, 0.7)  (0.0, 0.5)  (0.8, 0.1)  (0.7, 0.1)  (0.7, 0.1)  (0.0, 0.0)
Y9     (0.0, 0.1)  (0.5, 0.1)  (0.3, 0.1)  (0.7, 0.3)  (0.1, 0.3)  (0.7, 0.2)
Y10    (0.3, 0.2)  (0.7, 0.1)  (0.2, 0.2)  (0.2, 0.0)  (0.1, 0.9)  (0.9, 0.1)


Table 4
The clustering results with the netting method.

λ level | Clustering
0.67 < λ ≤ 1 | {A1}, {A2}, {A3}, {A4}, {A5}, {A6}, {A7}, {A8}, {A9}, {A10}
0.63 < λ ≤ 0.67 | {A4, A9}, {A1}, {A2}, {A3}, {A5}, {A6}, {A7}, {A8}, {A10}
(0.63, 0.14) < λ ≤ (0.63, 0.08) | {A1, A5}, {A4, A9}, {A2}, {A3}, {A6}, {A7}, {A8}, {A10}
0.57 < λ ≤ 0.63 | {A1, A5}, {A4, A9, A10}, {A2}, {A3}, {A6}, {A7}, {A8}
(0.57, 0.22) < λ ≤ (0.57, 0.08) | {A1, A5}, {A4, A9, A10}, {A2, A6}, {A3}, {A7}, {A8}
0.55 < λ ≤ 0.57 | {A1, A5}, {A2, A4, A6, A9, A10}, {A3}, {A7}, {A8}
0.49 < λ ≤ 0.55 | {A1, A5, A7}, {A2, A4, A6, A8, A9, A10}, {A3}
(0.49, 0.16) < λ ≤ (0.49, 0.08) | {A1, A5, A7}, {A2, A3, A4, A6, A8, A9, A10}
0 < λ ≤ 0.49 | {A1, A2, A3, A4, A5, A6, A7, A8, A9, A10}

Table 5
The clustering results with Zhang et al.'s method.

λ level | Clustering
0.67 < λ ≤ 1 | {A1}, {A2}, {A3}, {A4}, {A5}, {A6}, {A7}, {A8}, {A9}, {A10}
0.63 < λ ≤ 0.67 | {A4, A9}, {A1}, {A2}, {A3}, {A5}, {A6}, {A7}, {A8}, {A10}
0.57 < λ ≤ 0.63 | {A1, A5}, {A4, A9, A10}, {A2}, {A3}, {A6}, {A7}, {A8}
0.55 < λ ≤ 0.57 | {A1, A5}, {A2, A4, A6, A9, A10}, {A3}, {A7}, {A8}
0.49 < λ ≤ 0.55 | {A1, A5, A7}, {A2, A3, A4, A6, A8, A9, A10}
0 < λ ≤ 0.49 | {A1, A2, A3, A4, A5, A6, A7, A8, A9, A10}


Why does our method have these characteristics? For one thing, the proposed netting method can rely on a similarity relation instead of an equivalence relation, as in the fuzzy environment. For another, whether in [15] or in this work, the class standards are all based on the λ-cutting matrix, so λ is an important parameter that decides the class scale. In [15], before getting the λ-cutting matrix, Zhang et al. first transformed the intuitionistic fuzzy matrix into an intuitionistic fuzzy similarity matrix, and then calculated its equivalent matrix, which needs a lot of computational effort. In this work, we not only get the λ-cutting matrix directly from the intuitionistic fuzzy similarity matrix, but also improve the principle of choosing λ. Since Zhang et al.'s work needs to transform the intuitionistic fuzzy similarity matrix into an intuitionistic fuzzy equivalent matrix, some information may be lost during this process. Namely, the intuitionistic fuzzy equivalent matrix cannot reflect all the information that the intuitionistic fuzzy similarity matrix contains. Considering the reasons stated above, it is not hard to comprehend why we can get more detailed classes than [15].

This work only makes a comparison with [15], because the method in [15] is representative of those solving this class of problems; some closely related results can be found in [16,17].

In order to demonstrate the effectiveness of the proposed clustering algorithm, we further conduct experiments with simulated data by comparing these two methods.

Example 2. As we have explained above, the computational complexity is mainly related to the computations of the intuitionistic fuzzy similarity matrix and the intuitionistic fuzzy equivalent matrix. Next, we shall illustrate this with simulated experiments. Below we first introduce the experimental tool and the experimental data sets, and then make a comparison with the other method [15].

(1) Experimental tool. In the experiments, we use the netting algorithm implemented in MATLAB. Note that if we let π(x) = 0 for any x ∈ X, then the netting algorithm reduces to the traditional fuzzy netting algorithm. Therefore, we can use this process to compare the performances of the netting algorithm under the intuitionistic fuzzy environment and the netting algorithm under the fuzzy environment.


(2) Experimental data sets. The car data set contains the information of ten new cars to be classified in an auto market. Let Yi (i = 1, 2, ..., 10) be the cars, each of which is described by six attributes: (1) G1: oil consumption; (2) G2: coefficient of friction; (3) G3: price; (4) G4: comfortable degree; (5) G5: design; (6) G6: safety coefficient, as in Example 1 (for convenience, here we do not consider the weights of these attributes). The characteristics of the ten new cars under the six attributes, generated at random by MATLAB, are represented by the IFSs shown in Table 3.

In order to demonstrate the validity of the netting method, we shall make a comparison with Zhang et al.'s method [15].


Table 6
Elapsed time (in seconds) for each method; the columns give the number of alternatives.

Method                  10        50        100       500        1000       2000
Netting method          0.000174  0.004637  0.013931  1.585204   11.721117  102.472592
Zhang et al.'s method   0.002361  0.035407  0.6729    10.636214  78.620455  691.554396

With the netting method, we have the clustering results shown in Table 4.

Using Zhang et al.'s method, we first construct the intuitionistic fuzzy similarity matrix based on the data in Table 3:

Z =
[ (1, 0)        (0.41, 0.08)  (0.33, 0.24)  (0.43, 0.08)  (0.63, 0.08)  (0.38, 0.14)  (0.55, 0.00)  (0.46, 0.08)  (0.35, 0.00)  (0.43, 0.24) ]
[ (0.41, 0.08)  (1, 0)        (0.41, 0.08)  (0.49, 0.16)  (0.36, 0.14)  (0.57, 0.08)  (0.41, 0.14)  (0.43, 0.14)  (0.49, 0.16)  (0.57, 0.22) ]
[ (0.33, 0.24)  (0.41, 0.08)  (1, 0)        (0.46, 0.08)  (0.35, 0.08)  (0.22, 0.16)  (0.43, 0.36)  (0.25, 0.29)  (0.33, 0.08)  (0.49, 0.08) ]
[ (0.43, 0.08)  (0.49, 0.16)  (0.46, 0.08)  (1, 0)        (0.49, 0.00)  (0.33, 0.22)  (0.43, 0.08)  (0.33, 0.08)  (0.67, 0.00)  (0.63, 0.14) ]
[ (0.63, 0.08)  (0.36, 0.14)  (0.35, 0.08)  (0.49, 0.00)  (1, 0)        (0.27, 0.22)  (0.30, 0.08)  (0.27, 0.08)  (0.36, 0.08)  (0.46, 0.16) ]
[ (0.38, 0.14)  (0.57, 0.08)  (0.22, 0.16)  (0.33, 0.22)  (0.27, 0.22)  (1, 0)        (0.38, 0.22)  (0.55, 0.00)  (0.33, 0.08)  (0.22, 0.22) ]
[ (0.55, 0.00)  (0.41, 0.14)  (0.43, 0.36)  (0.43, 0.08)  (0.30, 0.08)  (0.38, 0.22)  (1, 0)        (0.38, 0.08)  (0.34, 0.21)  (0.36, 0.22) ]
[ (0.46, 0.08)  (0.43, 0.14)  (0.25, 0.29)  (0.33, 0.08)  (0.27, 0.08)  (0.55, 0.00)  (0.38, 0.08)  (1, 0)        (0.33, 0.16)  (0.22, 0.36) ]
[ (0.35, 0.00)  (0.49, 0.16)  (0.33, 0.08)  (0.67, 0.00)  (0.36, 0.08)  (0.33, 0.08)  (0.35, 0.22)  (0.33, 0.16)  (1, 0)        (0.43, 0.08) ]
[ (0.43, 0.24)  (0.57, 0.22)  (0.49, 0.08)  (0.63, 0.14)  (0.46, 0.16)  (0.22, 0.22)  (0.36, 0.22)  (0.22, 0.36)  (0.43, 0.08)  (1, 0)       ]

In order to get the clustering result with Zhang et al.'s method, we should first get the equivalent matrix. By the composition operation of similarity matrices, we have

Z̃2 = Z ∘ Z =
[ (1, 0)        (0.43, 0.08)  (0.43, 0.08)  (0.49, 0.00)  (0.63, 0.08)  (0.46, 0.08)  (0.55, 0.00)  (0.46, 0.08)  (0.43, 0.00)  (0.46, 0.08) ]
[ (0.43, 0.08)  (1, 0)        (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.57, 0.08)  (0.43, 0.08)  (0.55, 0.08)  (0.49, 0.08)  (0.57, 0.08) ]
[ (0.43, 0.08)  (0.49, 0.08)  (1, 0)        (0.49, 0.08)  (0.46, 0.08)  (0.41, 0.08)  (0.43, 0.08)  (0.41, 0.08)  (0.46, 0.08)  (0.49, 0.08) ]
[ (0.49, 0.00)  (0.57, 0.08)  (0.49, 0.08)  (1, 0)        (0.49, 0.00)  (0.49, 0.08)  (0.43, 0.08)  (0.43, 0.08)  (0.67, 0.00)  (0.63, 0.08) ]
[ (0.63, 0.08)  (0.49, 0.08)  (0.46, 0.08)  (0.49, 0.00)  (1, 0)        (0.38, 0.08)  (0.55, 0.08)  (0.46, 0.08)  (0.49, 0.00)  (0.49, 0.08) ]
[ (0.46, 0.08)  (0.57, 0.08)  (0.41, 0.08)  (0.49, 0.08)  (0.38, 0.08)  (1, 0)        (0.41, 0.08)  (0.55, 0.00)  (0.49, 0.08)  (0.57, 0.08) ]
[ (0.55, 0.00)  (0.43, 0.08)  (0.43, 0.08)  (0.43, 0.08)  (0.55, 0.08)  (0.41, 0.08)  (1, 0)        (0.46, 0.08)  (0.43, 0.00)  (0.43, 0.14) ]
[ (0.46, 0.08)  (0.55, 0.08)  (0.41, 0.08)  (0.43, 0.08)  (0.46, 0.08)  (0.55, 0.00)  (0.46, 0.08)  (1, 0)        (0.43, 0.08)  (0.43, 0.14) ]
[ (0.43, 0.00)  (0.49, 0.08)  (0.46, 0.08)  (0.67, 0.00)  (0.49, 0.00)  (0.49, 0.08)  (0.43, 0.00)  (0.43, 0.08)  (1, 0)        (0.63, 0.08) ]
[ (0.46, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.63, 0.08)  (0.49, 0.08)  (0.57, 0.08)  (0.43, 0.14)  (0.43, 0.14)  (0.63, 0.08)  (1, 0)       ]

Z̃4 = Z̃2 ∘ Z̃2 =
[ (1, 0)        (0.49, 0.08)  (0.49, 0.08)  (0.49, 0.00)  (0.63, 0.00)  (0.49, 0.08)  (0.55, 0.00)  (0.46, 0.08)  (0.49, 0.00)  (0.49, 0.08) ]
[ (0.49, 0.08)  (1, 0)        (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.55, 0.08)  (0.57, 0.08)  (0.57, 0.08) ]
[ (0.49, 0.08)  (0.49, 0.08)  (1, 0)        (0.49, 0.08)  (0.49, 0.08)  (0.49, 0.08)  (0.46, 0.08)  (0.49, 0.08)  (0.49, 0.08)  (0.49, 0.08) ]
[ (0.49, 0.00)  (0.57, 0.08)  (0.49, 0.08)  (1, 0)        (0.49, 0.00)  (0.57, 0.08)  (0.49, 0.00)  (0.55, 0.08)  (0.67, 0.00)  (0.63, 0.08) ]
[ (0.63, 0.00)  (0.49, 0.08)  (0.49, 0.08)  (0.49, 0.00)  (1, 0)        (0.49, 0.08)  (0.55, 0.00)  (0.49, 0.08)  (0.49, 0.00)  (0.49, 0.08) ]
[ (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (1, 0)        (0.46, 0.08)  (0.55, 0.00)  (0.57, 0.08)  (0.57, 0.08) ]
[ (0.55, 0.00)  (0.49, 0.08)  (0.46, 0.08)  (0.49, 0.00)  (0.55, 0.00)  (0.46, 0.08)  (1, 0)        (0.46, 0.08)  (0.49, 0.00)  (0.49, 0.08) ]
[ (0.46, 0.08)  (0.55, 0.08)  (0.49, 0.08)  (0.55, 0.08)  (0.49, 0.08)  (0.55, 0.00)  (0.46, 0.08)  (1, 0)        (0.49, 0.08)  (0.55, 0.08) ]
[ (0.49, 0.00)  (0.57, 0.08)  (0.49, 0.08)  (0.67, 0.00)  (0.49, 0.00)  (0.57, 0.08)  (0.49, 0.00)  (0.49, 0.08)  (1, 0)        (0.63, 0.08) ]
[ (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.63, 0.08)  (0.49, 0.08)  (0.57, 0.08)  (0.49, 0.08)  (0.55, 0.08)  (0.63, 0.08)  (1, 0)       ]

After computation, we have Z̃8 = Z̃4; thus we can make the clustering analysis with Zhang et al.'s method. The clustering results are shown in Table 5.

We can see from Tables 4 and 5 that the netting method can derive more detailed clustering results than Zhang et al.'s method.

In order to illustrate the computational complexity, we generate amounts of IFNs at random by MATLAB. Then, for the two methods respectively, we measure the computation time needed to obtain the corresponding matrix on which the clustering analysis can be made. The elapsed time (in seconds) for each method is shown in Table 6.

Let n and m represent the number of alternatives and attributes, respectively. Then the computational complexities of our method and Zhang et al.'s method are O(mn + (1/2)n²) and O(mn + (1/2)n² + kn²), respectively, where k (k ≥ 2) represents the number of compositions needed until we get the equivalent matrix. The elapsed times may become closer as n increases. Considering practical applications, we think the netting method can save much time and computational effort.

Table 6
Elapsed time (in seconds).

Netting method          1.585204    11.721117    102.472592
Zhang et al.'s method   10.636214   78.620455    691.554396
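The timing experiment described above can be sketched as follows; this is illustrative Python rather than the paper's MATLAB code, and the matrix size n is a placeholder.

```python
import random
import time

def random_ifn():
    # A random intuitionistic fuzzy number (mu, nu) with mu + nu <= 1.
    mu = random.random()
    return (mu, random.uniform(0.0, 1.0 - mu))

def random_similarity_matrix(n):
    # Symmetric matrix of random IFNs with (1, 0) on the diagonal,
    # standing in for the intuitionistic fuzzy similarity matrix.
    Z = [[(1.0, 0.0)] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            Z[i][j] = Z[j][i] = random_ifn()
    return Z

n = 50  # illustrative size
Z = random_similarity_matrix(n)
start = time.perf_counter()
# ... run the clustering method under test on Z here ...
elapsed = time.perf_counter() - start  # elapsed time in seconds
```

Measuring each method on the same randomly generated matrix, for increasing n, yields a table of the shape shown above.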


6. Concluding remarks

We have presented a new way to measure the intuitionistic fuzzy similarity degree between two intuitionistic fuzzy sets, and applied it to construct the intuitionistic fuzzy similarity matrix. Based on the netting technique, we have developed a method for clustering objects that are represented by intuitionistic fuzzy information. In the process of clustering analysis, we have given the principle for choosing the confidence levels so as to get better clustering results. With the numerical examples, we have found that the intuitionistic fuzzy netting method not only needs less computational effort, but also gets more detailed clustering results than the existing intuitionistic fuzzy clustering method.
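As an illustration of the confidence-level idea, the following sketch thresholds an intuitionistic fuzzy similarity matrix at an assumed confidence level (lam, nu0) and groups the linked objects. Connected components stand in here for the paper's netting diagram, and all names and the toy matrix are illustrative.

```python
# Objects i and j are linked when mu_ij >= lam and nu_ij <= nu0;
# linked objects are then grouped together via union-find.

def cut_clusters(Z, lam, nu0):
    n = len(Z)
    parent = list(range(n))  # union-find forest over the objects

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            mu, nu = Z[i][j]
            if mu >= lam and nu <= nu0:  # i and j pass the cut: merge
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

Z = [[(1.0, 0.0), (0.8, 0.1), (0.4, 0.5)],
     [(0.8, 0.1), (1.0, 0.0), (0.5, 0.4)],
     [(0.4, 0.5), (0.5, 0.4), (1.0, 0.0)]]
# cut_clusters(Z, 0.6, 0.2) -> [[0, 1], [2]]
# cut_clusters(Z, 0.4, 0.5) -> [[0, 1, 2]]
```

Raising the confidence level splits the objects into finer clusters, which is the role the chosen confidence levels play in the clustering analysis above.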

Acknowledgements

The authors are very grateful to the Editor-in-Chief, Professor Rajkumar Roy, and the anonymous reviewers for their insightful and constructive comments and suggestions that have led to an improved version of this paper.

References

[1] K. Atanassov, Intuitionistic fuzzy sets, Fuzzy Sets and Systems 20 (1986) 87–96.
[2] L.A. Zadeh, Fuzzy sets, Information and Control 8 (1965) 338–356.
[3] K. Atanassov, G. Pasi, R.R. Yager, Intuitionistic fuzzy interpretations of multi-criteria multi-person and multi-measurement tool decision making, International Journal of Systems Science 36 (2005) 859–868.
[4] H.W. Liu, G.J. Wang, Multi-criteria decision-making methods based on intuitionistic fuzzy sets, European Journal of Operational Research 179 (2007) 220–233.
[5] Z.S. Xu, R.R. Yager, Dynamic intuitionistic fuzzy multiple attribute decision making, International Journal of Approximate Reasoning 48 (2008) 246–262.
[6] Z.S. Xu, A distance measure based method for interval-valued intuitionistic fuzzy multi-attribute group decision making, Information Sciences 180 (2010) 181–190.
[7] D.H. Hong, C.H. Choi, Multicriteria fuzzy decision-making problems based on vague set theory, Fuzzy Sets and Systems 114 (2000) 103–113.
[8] W.L. Hung, M.S. Yang, Similarity measures of intuitionistic fuzzy sets based on Hausdorff distance, Pattern Recognition Letters 25 (2004) 1603–1611.
[9] I.K. Vlachos, G.D. Sergiadis, Intuitionistic fuzzy information – applications to pattern recognition, Pattern Recognition Letters 28 (2007) 197–206.
[10] W.L. Hung, M.S. Yang, Similarity measures of intuitionistic fuzzy sets based on Lp metric, International Journal of Approximate Reasoning 46 (2007) 120–136.
[11] Z.Z. Liang, P.F. Shi, Similarity measures on intuitionistic fuzzy sets, Pattern Recognition Letters 24 (2003) 2687–2693.
[12] V. Khatibi, G.A. Montazer, Intuitionistic fuzzy set vs. fuzzy set application in medical pattern recognition, Artificial Intelligence in Medicine 47 (2009) 43–52.
[13] S.K. De, R. Biswas, A.R. Roy, An application of intuitionistic fuzzy sets in medical diagnosis, Fuzzy Sets and Systems 117 (2001) 209–213.
[14] M.R. Anderberg, Cluster Analysis for Applications, Academic Press, New York, 1972.
[15] H.M. Zhang, Z.S. Xu, Q. Chen, On clustering approach to intuitionistic fuzzy sets, Control and Decision 22 (2007) 882–888.
[16] Z.S. Xu, J. Chen, J.J. Wu, Clustering algorithm of intuitionistic fuzzy sets, Information Sciences 178 (2008) 3775–3790.
[17] R. Cai, Y.J. Lei, X.J. Zhao, Clustering method based on intuitionistic fuzzy equivalent dissimilarity matrix, Journal of Computer Applications 29 (2009) 123–126.
[18] Z.S. Xu, Intuitionistic fuzzy aggregation operators, IEEE Transactions on Fuzzy Systems 15 (2007) 1179–1187.
[19] Z.S. Xu, R.R. Yager, Some geometric aggregation operators based on intuitionistic fuzzy sets, International Journal of General Systems 35 (2006) 417–433.
[20] H. Bustince, Construction of intuitionistic fuzzy relations with predetermined properties, Fuzzy Sets and Systems 109 (2000) 379–403.
[21] Z.X. He, Fuzzy Mathematics and Its Application, Tianjin Science and Technology Press, Tianjin, 1983.