
Neuro-fuzzy modeling between minimum and maximum

Claudio Moraga
University of Dortmund, Germany

Universidad Técnica Federico Santa María
Valparaíso, March 2001

Outline

• Motivation

• Data-driven modeling

• The ANFIS system

• Compensating systems

• Symmetric sums and S-functions

• Rules interpretation

• Conclusions


Motivation

Data-driven fuzzy modeling

• Generating fuzzy rules from examples:

Method of L.X. Wang and J.M. Mendel

• .......

• Neuro-fuzzy extraction of rules from examples:

Using feedforward neural networks with appropriate architecture


“The goal”

[Diagram: a neural network from which fuzzy IF-THEN rules are extracted]

Let L(x) denote the number of linguistic terms associated with x. The extracted rule base has L(x1)·L(x2) rules (e.g., three terms per variable already give 3·3 = 9 rules), but not necessarily as many different conclusions!

ANFIS-like rule extraction

[Diagram: ANFIS-like network. The inputs x1 and x2 feed membership nodes Tjk(xi); each pair (T1j(x1), T2i(x2)) is conjoined into a premise “If x1 is T1j and x2 is T2i”, and each premise node is connected to its conclusion]

Analysis of ANFIS

Advantages of ANFIS:

• ANFIS is a very good system to extract numerical models from numerical data

• ANFIS allows in principle the extraction of fuzzy rules from numerical data, but this aspect has not been further developed

Drawbacks of ANFIS:

• The user has to define a priori the number of linguistic terms to be considered for the variables

• The conjunction of premises is based on (differentiable) t-norms, i.e., they are stricter than the minimum, and they induce a grid-like partition of the input space

Evolutionary front-end

[Diagram: networks over the inputs x1, x2, preceded by an evolutionary front-end]

The golden rule of soccer for a coach

“If a player runs (with the ball) 100 m in 10 sec, and he is a dribbling king, and wherever he sees a free spot of the net he shoots the ball, and the transfer fee is reasonable,

then

go get him!!”

Table 1: Decision table based on t-norms

             t-norm: product              t-norm: minimum
             Player 1  Player 2  Player 3  Player 1  Player 2  Player 3
Speed          0.8       0.7       0.9       0.8       0.7       0.9
Dribbling      0.8       0.5       0.8       0.8       0.5       0.8
Shooting       0.9       0.9       0.5       0.9       0.9       0.5
Low Fee        0.5       0.9       0.7       0.5       0.9       0.7
Result         0.288     0.283     0.252     0.5       0.5       0.5

Note that the minimum cannot discriminate between the three players: each one is rated by his weakest criterion alone. (See the sketch below.)
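The Result rows can be reproduced in a few lines of Python; a minimal sketch using the premise degrees from the table:

from functools import reduce

# Premise degrees per player: speed, dribbling, shooting, low fee
players = {
    "Player 1": [0.8, 0.8, 0.9, 0.5],
    "Player 2": [0.7, 0.5, 0.9, 0.9],
    "Player 3": [0.9, 0.8, 0.5, 0.7],
}

for name, degrees in players.items():
    t_prod = reduce(lambda a, b: a * b, degrees)  # product t-norm
    t_min = min(degrees)                          # minimum t-norm
    print(f"{name}: product = {t_prod:.3f}, minimum = {t_min:.1f}")

# The minimum yields 0.5 for all three players: no discrimination.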


Compensating systems (“the world between min and max”)

• Combinations of t-norms and t-conorms, e.g.:
  - γ-operators (Zimmermann and Zysno)
  - Weighted min and max (Dubois)
  - Symmetric sums (Silvert, Dombi)
• Generalized averaging operators, e.g.:
  - Ordered weighted average (OWA)
  - Weighted ordered weighted average (WOWA)
  - Quasi-linear average

Learning the γ-operator with a neural network

[Diagram: ANFIS-like network in which the memberships T11(x1) and T23(x2) are combined by a node with trainable weights γ3 and 1-γ3, producing aggrγ3[T11(x1), T23(x2)]]

Let y1 = T11(x1) and y2 = T23(x2). Then

aggrγ(y1, y2) = γ·t(y1, y2) + (1-γ)·t*(y1, y2)

with t(y1, y2) = y1·y2 (product)
     t*(y1, y2) = y1 + y2 - y1·y2 (algebraic sum)
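Since aggrγ is linear in γ, its derivative with respect to γ is simply t(y1, y2) - t*(y1, y2), so the parameter can be fitted by gradient descent. A minimal sketch (not the author's exact training setup; the training pairs are hypothetical):

def t(y1, y2):          # product t-norm
    return y1 * y2

def t_star(y1, y2):     # algebraic sum (the dual t-conorm)
    return y1 + y2 - y1 * y2

# Hypothetical data: premise degrees (y1, y2) and a desired aggregated value
data = [((0.8, 0.6), 0.70), ((0.3, 0.9), 0.60), ((0.5, 0.4), 0.45)]

gamma, lr = 0.5, 0.5
for _ in range(500):
    grad = 0.0
    for (y1, y2), target in data:
        out = gamma * t(y1, y2) + (1 - gamma) * t_star(y1, y2)
        grad += 2 * (out - target) * (t(y1, y2) - t_star(y1, y2))
    gamma -= lr * grad / len(data)
    gamma = min(1.0, max(0.0, gamma))   # keep gamma in [0, 1]

print(f"learned gamma = {gamma:.3f}")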

Generalized weighted operators of Dubois

Let w = [w1, w2, ..., wn] with wi ∈ [0,1], 1 ≤ i ≤ n. Then, for yi ∈ [0,1], 1 ≤ i ≤ n, a t-norm t and its dual t-conorm s:

tw(y1, ..., yn) = t( s(y1, 1-w1), s(y2, 1-w2), ..., s(yn, 1-wn) )

sw(y1, ..., yn) = s( t(y1, w1), t(y2, w2), ..., t(yn, wn) )

Example: n = 2; let t be the product and s the algebraic sum. Then:

tw(y1, y2) = (y1 + (1-w1) - y1·(1-w1)) · (y2 + (1-w2) - y2·(1-w2))
           = ((1-w1) + w1·y1) · ((1-w2) + w2·y2)

sw(y1, y2) = w1·y1 + w2·y2 - w1·w2·y1·y2

Generalized weighted operators of Dubois in ANFIS

[Diagram: ANFIS-like network in which the memberships T1i(x1) and T2j(x2) are combined through trainable weights wi, wj (and their complements 1-wi, 1-wj), yielding prodW(T1i(x1), T2j(x2)) and p_sumW(T1i(x1), T2j(x2))]

prodW(y1, y2) = ((1-w1) + w1·y1) · ((1-w2) + w2·y2)

p_sumW(y1, y2) = w1·y1 + w2·y2 - w1·w2·y1·y2
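A minimal sketch of the two operators for arbitrary n, instantiated with t = product and s = algebraic sum as above (the function names are ours, not part of ANFIS):

def prod_w(ys, ws):
    """t_w: weight w_i = 1 keeps input i fully relevant, w_i = 0 ignores it."""
    out = 1.0
    for y, w in zip(ys, ws):
        out *= (1 - w) + w * y        # factor s(y_i, 1 - w_i), simplified
    return out

def p_sum_w(ys, ws):
    """s_w: algebraic sum of the weighted degrees t(y_i, w_i) = w_i * y_i."""
    out = 0.0
    for y, w in zip(ys, ws):
        wy = w * y
        out = out + wy - out * wy     # algebraic sum, applied pairwise
    return out

print(prod_w([0.8, 0.6], [1.0, 1.0]))   # 0.48: plain product when all w_i = 1
print(p_sum_w([0.8, 0.6], [1.0, 1.0]))  # 0.92: plain algebraic sum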

Extended logistic function - Symmetric sum

For x ∈ R and k ∈ R, k > 1, let S(x) = 1/(1 + k^(-x)). Then k^(-x) = (1 - S(x))/S(x), and therefore

S(x1 + x2) = 1 / (1 + k^(-x1)·k^(-x2))
           = 1 / (1 + [(1-S(x1))/S(x1)]·[(1-S(x2))/S(x2)])
           = S(x1)·S(x2) / ( S(x1)·S(x2) + [1-S(x1)]·[1-S(x2)] )

Define S(x1) ⊕ S(x2) =def S(x1 + x2).

⊕ represents a symmetric sum operation and gives a non-linear combination of a t-norm and a t-conorm. Moreover, ((0,1), ⊕) is an abelian group with identity ½ and inverse 1-(·).
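Both the derived identity and the group properties can be checked numerically; a minimal sketch with an arbitrarily chosen k:

k = 3.0
S = lambda x: 1.0 / (1.0 + k ** (-x))

def sym_sum(a, b):
    return a * b / (a * b + (1 - a) * (1 - b))

x1, x2 = 0.7, -1.2
print(S(x1 + x2), sym_sum(S(x1), S(x2)))  # identical values

a = 0.83
print(sym_sum(a, 0.5))       # = a:   1/2 is the identity element
print(sym_sum(a, 1 - a))     # = 1/2: 1 - a is the inverse of a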

[Plot: a t-norm t and a t-conorm s on [0,1]]

Interpretation

Neural network → fuzzy logic interpretation:

The activation function S(j)(wij·xi) represents the membership function of the j-th fuzzy set (linguistic term) associated with, e.g., the weighted i-th input.

[Diagram: feedforward network with inputs x1, x2, ..., xn, hidden nodes with activation functions S(1), S(2), ..., S(k), and output S(x); node S(1) receives the weighted inputs w11·x1, w21·x2, ..., wn1·xn]

The weight wij affects the slope of S(j), thus acting as a linguistic modifier.

Neural network → fuzzy logic interpretation:

The value of the i-th input represents the value of the i-th premise, and the value of the corresponding conclusion is obtained as the symmetric sum of the degrees of satisfaction of the modified linguistic terms induced by the premises.

Neural network → fuzzy logic interpretation:

Let yj = S(j)(w·x) = S(j)(w1j·x1 + w2j·x2 + ... + wnj·xn). Since S(j)(a + b) = S(j)(a) ⊕ S(j)(b), node j encodes the rule:

if x1 is S(j)w1j and ... and xn is S(j)wnj, then yj = S(j)w1j(x1) ⊕ ... ⊕ S(j)wnj(xn)

where S(j)wij denotes the linguistic term S(j) modified by the weight wij, i.e. S(j)wij(xi) = S(j)(wij·xi).
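A minimal numerical illustration of this rule reading (weights and inputs are hypothetical; S is the logistic function, an S-function as defined in the next section):

import math

S = lambda x: 1.0 / (1.0 + math.exp(-x))   # logistic S-function

def sym_sum(a, b):
    return a * b / (a * b + (1 - a) * (1 - b))

w = [1.5, -0.8, 2.0]    # hypothetical weights w_1j, w_2j, w_3j
x = [0.4, 0.9, 0.2]     # hypothetical inputs

# What the network computes at hidden node j:
y_direct = S(sum(wi * xi for wi, xi in zip(w, x)))

# The fuzzy reading: symmetric sum of the modified-term memberships
memberships = [S(wi * xi) for wi, xi in zip(w, x)]
y_rule = memberships[0]
for m in memberships[1:]:
    y_rule = sym_sum(y_rule, m)

print(y_direct, y_rule)   # identical: the hidden node encodes one fuzzy rule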

Table 2: Player selection with the ⊕-operator as connective

             Player 1  Player 2  Player 3
Speed          0.8       0.7       0.9
Dribbling      0.8       0.5       0.8
Shooting       0.9       0.9       0.5
Low Fee        0.5       0.9       0.7
Result         0.9931    0.9947    0.9882

Unlike the minimum in Table 1, the compensating connective discriminates between the players: Player 2 now ranks best.
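The Result row follows by folding each player's four degrees with the symmetric sum (which is associative, so the order is irrelevant); a minimal sketch reusing the data of Table 2:

from functools import reduce

def sym_sum(a, b):
    return a * b / (a * b + (1 - a) * (1 - b))

players = {
    "Player 1": [0.8, 0.8, 0.9, 0.5],
    "Player 2": [0.7, 0.5, 0.9, 0.9],
    "Player 3": [0.9, 0.8, 0.5, 0.7],
}

for name, degrees in players.items():
    print(f"{name}: {reduce(sym_sum, degrees):.4f}")  # 0.9931, 0.9947, 0.9882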

S-functions

Definition: Let f : R → (0,1) be continuous and strictly monotone increasing, such that:

• lim x→-∞ f(x) = 0
• lim x→+∞ f(x) = 1
• ∀x ∈ R: f(-x) = 1 - f(x)

Then f is said to be an S-function.

Examples:

f(x) = 1/[1 + e^(-x)] ; f(x) = 1/[1 + k^(-x)], k > 1
f(x) = 1 - (1/π)·arccot(x)
f(x) = (1/2)·[1 + x/(1 + |x|)]

S-activation functions

Theorem: A neural network using S-functions as activation functions has the universal approximation property.

Definition: Let f be an S-function, and for x1, x2 ∈ R let f(x1) = vx1 and f(x2) = vx2. Then

f(x1) ⊕ f(x2) =def f( f^(-1)(vx1) + f^(-1)(vx2) )

is an aggregation operator (the general form of a symmetric summation), and f is its generating function.
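A minimal sketch of this generator-based aggregation, instantiated with the logistic S-function and its inverse (any S-function pair f, f⁻¹ would do):

import math

f = lambda x: 1.0 / (1.0 + math.exp(-x))        # logistic S-function
f_inv = lambda v: math.log(v / (1.0 - v))       # its inverse on (0,1)

def aggregate(v1, v2):
    return f(f_inv(v1) + f_inv(v2))

print(aggregate(0.8, 0.8))   # ≈ 0.94: two high degrees reinforce each other
print(aggregate(0.8, 0.2))   # = 0.5: opposite degrees cancel exactly
print(aggregate(0.5, 0.7))   # = 0.7: 1/2 is the neutral element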

Examples (with f = [1 + k^(-x-y)]^(-1))

[Surface plots of f(x) ⊕ f(y) for k = 1.5 and k = 5; both axes range over [0, 1]]

Conclusions

• There are real-world problems of a compensating type which cannot be properly modelled with t-norms

• Feedforward neural networks with S-activation functions may be used to extract compensating fuzzy if-then rules, in which the premises are combined with a symmetric sum

• The extracted rules explain the role of the hidden nodes of the neural network, i.e., neural networks (of the above class) are no longer “black boxes”

• The ANFIS architecture may be extended to allow extracting the parameter γ of the linear combination of a t-norm t and a t-conorm t*, and to learn weighted operators