Session 9: LEARNING VECTOR QUANTIZATION NETWORK


Page 1: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Session 9

LEARNING VECTOR QUANTIZATION NETWORK

Course: H0434/Jaringan Syaraf Tiruan (Artificial Neural Networks)

Year: 2005

Version: 1

Page 2: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Learning Outcomes

By the end of this session, students are expected

to be able to:

• Demonstrate a Learning Vector Quantization network

Page 3: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Material Outline

• Network Architecture

• Learning Rule

Page 4: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Learning Vector Quantization

The net input of the first layer is not computed by taking an inner product of the prototype vectors with the input, as in a standard competitive network. Instead, the net input is the negative of the Euclidean distance between each prototype vector and the input.
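The first-layer computation described above can be sketched in a few lines of Python: negative Euclidean distances as net inputs, followed by a winner-take-all transfer function. The prototype values below are illustrative only.

```python
import math

def net_input(prototypes, p):
    """Net input of each first-layer neuron: the NEGATIVE Euclidean
    distance between its prototype (weight row) and the input p."""
    return [-math.dist(w, p) for w in prototypes]

def compet(n):
    """Competitive transfer function: 1 for the neuron with the
    largest net input (i.e. the closest prototype), 0 elsewhere."""
    winner = max(range(len(n)), key=lambda i: n[i])
    return [1 if i == winner else 0 for i in range(len(n))]

# Illustrative prototypes (rows) and one input vector.
prototypes = [[0.25, 0.75], [0.75, 0.75], [1.00, 0.25], [0.50, 0.25]]
a1 = compet(net_input(prototypes, [0, 1]))
print(a1)  # -> [1, 0, 0, 0]: the first prototype is closest
```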

Page 5: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Subclass

For the LVQ network, the winning neuron in the first layer indicates the subclass to which the input vector belongs. There may be several different neurons (subclasses) that make up each class.

The second layer of the LVQ network combines subclasses into a single class. The columns of W2 represent subclasses, and the rows represent classes. W2 has a single 1 in each column, with the other elements set to zero. The row in which the 1 occurs indicates which class the appropriate subclass belongs to.

(W2)ki = 1  ⇒  subclass i is a part of class k

Page 6: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Example

W2 =
| 1 0 1 1 0 0 |
| 0 1 0 0 0 0 |
| 0 0 0 0 1 1 |

• Subclasses 1, 3 and 4 belong to class 1.

• Subclass 2 belongs to class 2.

• Subclasses 5 and 6 belong to class 3.

A single-layer competitive network can create convex classification regions. The second layer of the LVQ network can combine the convex regions to create more complex categories.
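The subclass-to-class combination above is just a matrix-vector product. A minimal Python sketch, using the W2 from this example (the winning subclass is chosen arbitrarily for illustration):

```python
# W2 from the example: rows = classes, columns = subclasses.
W2 = [
    [1, 0, 1, 1, 0, 0],  # class 1 <- subclasses 1, 3, 4
    [0, 1, 0, 0, 0, 0],  # class 2 <- subclass 2
    [0, 0, 0, 0, 1, 1],  # class 3 <- subclasses 5, 6
]

def second_layer(W2, a1):
    """a2 = W2 * a1: maps the winning subclass onto its class."""
    return [sum(w * a for w, a in zip(row, a1)) for row in W2]

# Suppose subclass 4 wins the first-layer competition:
a1 = [0, 0, 0, 1, 0, 0]
print(second_layer(W2, a1))  # -> [1, 0, 0]: the input is in class 1
```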

Page 7: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

LVQ Learning

w1_i*(q) = w1_i*(q−1) + α [ p(q) − w1_i*(q−1) ]    if a2_k = t_k = 1

w1_i*(q) = w1_i*(q−1) − α [ p(q) − w1_i*(q−1) ]    if a2_k = 1, t_k = 0

(i* is the winning first-layer neuron, α is the learning rate)

If the input pattern is classified correctly, then move the winning weight toward the input vector according to the Kohonen rule.

If the input pattern is classified incorrectly, then move the winning weight away from the input vector.

LVQ learning combines competitive learning with supervision. It requires a training set of examples of proper network behavior.

{p1, t1}, {p2, t2}, …, {pQ, tQ}
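The two update cases can be sketched as one small Python helper (a hypothetical function, not from the slides; alpha is the learning rate, which is 0.5 in the worked example that follows):

```python
def lvq_update(w, p, correct, alpha=0.5):
    """Kohonen-style LVQ update of the winning prototype w.
    If the classification was correct, move w toward the input p;
    otherwise move it away from p by the same amount."""
    sign = 1.0 if correct else -1.0
    return [wi + sign * alpha * (pi - wi) for wi, pi in zip(w, p)]

# Correct classification pulls the prototype toward the input:
print(lvq_update([0.25, 0.75], [0, 1], correct=True))   # -> [0.125, 0.875]
# An incorrect classification would push it away instead.
```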

Page 8: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Example

p1 = [0; 1], t1 = [1; 0]        p2 = [1; 0], t2 = [0; 1]

p3 = [1; 1], t3 = [1; 0]        p4 = [0; 0], t4 = [0; 1]

W1 =
| 0.25 0.75 |   (w1_1 T)
| 0.75 0.75 |   (w1_2 T)
| 1.00 0.25 |   (w1_3 T)
| 0.50 0.25 |   (w1_4 T)

W2 =
| 1 1 0 0 |
| 0 0 1 1 |

Page 9: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

First Iteration

a1 = compet(n1) = compet( [ −||w1_1 − p1|| ; −||w1_2 − p1|| ; −||w1_3 − p1|| ; −||w1_4 − p1|| ] )

   = compet( [ −||[0.25; 0.75] − [0; 1]|| ; −||[0.75; 0.75] − [0; 1]|| ; −||[1.00; 0.25] − [0; 1]|| ; −||[0.50; 0.25] − [0; 1]|| ] )

   = compet( [ −0.354 ; −0.791 ; −1.25 ; −0.901 ] )

   = [ 1 ; 0 ; 0 ; 0 ]

Page 10: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Second Layer

a2 = W2 a1 = [ 1 1 0 0 ; 0 0 1 1 ] [ 1 ; 0 ; 0 ; 0 ] = [ 1 ; 0 ]

w1_1(1) = w1_1(0) + α ( p1 − w1_1(0) )

w1_1(1) = [ 0.25 ; 0.75 ] + 0.5 ( [ 0 ; 1 ] − [ 0.25 ; 0.75 ] ) = [ 0.125 ; 0.875 ]

This is the correct class (a2 matches t1), therefore the winning weight vector is moved toward the input vector.
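The whole first iteration can be checked end to end with a short self-contained Python script that reproduces the numbers on these slides:

```python
import math

# Training pair p1, t1 and the initial weights from the example.
W1 = [[0.25, 0.75], [0.75, 0.75], [1.00, 0.25], [0.50, 0.25]]
W2 = [[1, 1, 0, 0], [0, 0, 1, 1]]
p1, t1 = [0, 1], [1, 0]
alpha = 0.5

# First layer: negative distances as net inputs, then competition.
n1 = [-math.dist(w, p1) for w in W1]
winner = max(range(len(n1)), key=lambda i: n1[i])
a1 = [1 if i == winner else 0 for i in range(len(W1))]

# Second layer: a2 = W2 * a1 picks the class of the winning subclass.
a2 = [sum(wij * aj for wij, aj in zip(row, a1)) for row in W2]

# Classification is correct (a2 == t1), so move the winning
# prototype toward the input according to the Kohonen rule.
if a2 == t1:
    W1[winner] = [w + alpha * (x - w) for w, x in zip(W1[winner], p1)]

print(W1[winner])  # -> [0.125, 0.875], matching the slide
```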

Page 11: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Figure

Page 12: Session 9  LEARNING VECTOR QUANTIZATION NETWORK

Final Decision Regions