Principles of Artificial Neural Networks - ToC



PRINCIPLES OF ARTIFICIAL NEURAL NETWORKS

3rd Edition



    ADVANCED SERIES IN CIRCUITS AND SYSTEMS

Editor-in-Charge: Wai-Kai Chen (Univ. Illinois, Chicago, USA)
Associate Editor: Dieter A. Mlynski (Univ. Karlsruhe, Germany)

    Published

Vol. 1: Interval Methods for Circuit Analysis, by L. V. Kolev

Vol. 2: Network Scattering Parameters, by R. Mavaddat

Vol. 3: Principles of Artificial Neural Networks, by D. Graupe

Vol. 4: Computer-Aided Design of Communication Networks, by Y.-S. Zhu and W. K. Chen

Vol. 5: Feedback Networks: Theory and Circuit Applications, by J. Choma and W. K. Chen

Vol. 6: Principles of Artificial Neural Networks, Second Edition, by D. Graupe

Vol. 7: Principles of Artificial Neural Networks, Third Edition, by D. Graupe



NEW JERSEY • LONDON • SINGAPORE • BEIJING • SHANGHAI • HONG KONG • TAIPEI • CHENNAI

World Scientific

    Advanced Series in Circuits and Systems – Vol. 7

PRINCIPLES OF ARTIFICIAL NEURAL NETWORKS

3rd Edition

Daniel Graupe
University of Illinois, Chicago, USA



    Published by

    World Scientific Publishing Co. Pte. Ltd.

    5 Toh Tuck Link, Singapore 596224

    USA office: 27 Warren Street, Suite 401-402, Hackensack, NJ 07601

    UK office: 57 Shelton Street, Covent Garden, London WC2H 9HE

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

Advanced Series in Circuits and Systems — Vol. 7
PRINCIPLES OF ARTIFICIAL NEURAL NETWORKS
Third Edition

    Copyright © 2013 by World Scientific Publishing Co. Pte. Ltd.

All rights reserved. This book, or parts thereof, may not be reproduced in any form or by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system now known or to be invented, without written permission from the Publisher.

For photocopying of material in this volume, please pay a copying fee through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA. In this case permission to photocopy is not required from the publisher.

    ISBN 978-981-4522-73-1

    Printed in Singapore



Dedicated to the memory of my parents,
to my wife Dalia,
to our children, our daughters-in-law and our grandchildren.
It is also dedicated to the memory of Dr. Kate H. Kohn.






    Acknowledgments

I am most thankful to Hubert Kordylewski of the Department of Electrical Engineering and Computer Science of the University of Illinois at Chicago for his help towards the development of the LAMSTAR network of Chapter 9 of this text. Hubert helped me with his advice in all three editions of this book. I am grateful to several students who attended my classes on Neural Networks at the Department of Electrical Engineering and Computer Science of the University of Illinois at Chicago over the past fourteen years and who allowed me to append programs they wrote as part of homework assignments and course projects to various chapters of this book. They are Vasanth Arunachalam, Abdulla Al-Otaibi, Giovanni Paolo Gibilisco, Sang Lee, Maxim Kolesnikov, Hubert Kordylewski, Alvin Ng, Eric North, Maha Nujeimo, Michele Panzeri, Silvio Rizzi, Padmagandha Sahoo, Daniele Scarpazza, Sanjeeb Shah, Xiaoxiao Shi and Yunde Zhong.

I am deeply indebted to the memory of Dr. Kate H. Kohn of Michael Reese Hospital, Chicago and of the College of Medicine of the University of Illinois at Chicago, and to Dr. Boris Vern of the College of Medicine of the University of Illinois at Chicago, for reviewing parts of the manuscript of this text and for their helpful comments.

Ms. Barbara Aman and the production and editorial staff at World Scientific Publishing Company in Singapore were extremely helpful and patient with me during all phases of preparing this book for print.

Last but not least, my sincere thanks to Steven Patt, my Editor at World Scientific Publishing Company throughout all editions of this book, for his continuous help and support.







    Preface to the Third Edition

The Third Edition differs from the Second Edition in several important aspects. I added four new detailed case studies describing a variety of applications (new Sections 6.D, 7.C, 9.C and 9.D), together with their respective source codes. This brings the total number of applications to 19. This will allow the reader first-hand access to a wide range of different concrete APPLICATIONS of Neural Networks, ranging from medicine to constellation detection, thus establishing the main claim of Neural Networks, namely the claim of their GENERALITY.

The new case studies include an application to non-linear prediction problems (Case Study 9.C), which are indeed a field where artificial neural networks play, and will continue to play, a major role. This case study also compares the performance of two different neural networks in terms of accuracy and computational time for the specific problem of that case study. Also, two new Sections (9.6 and 9.8) were added to Chapter 9.

The organization of the text is also modified. The Chapter on the Large Memory Storage and Retrieval Neural Network (LAMSTAR) has moved from Chapter 13 of the Second Edition to become Chapter 9 in the present Edition. Consequently, the old Chapters 9 to 12 are now Chapters 10 to 13. This allows teaching and self-study to follow the main Artificial Neural Networks (ANN) in a more logical order in terms of basic principles and generality. We consider that in short courses, Chapters 1 to 9 will thus form the core of a course on ANN.

It is hoped that this text and this enhanced Edition can serve to show and to persuade scientists, engineers and program developers in areas ranging from medicine to finance and beyond of the value and the power of ANN in problems that are ill-defined, highly non-linear, stochastic and of time-varying dynamics, and which often appear to be beyond solution.

Additional corrections and minor modifications are also included, as are other updates based on recent developments, including those relating to the author’s research.

Daniel Graupe
Chicago, IL
March 2013







    Preface to the Second Edition

The Second Edition contains certain changes and additions to the First Edition. Apart from corrections of typos and insertion of minor additional details that I considered to be helpful to the reader, I decided to interchange the order of Chapters 4 and 5 and to rewrite Chapter 13 so as to make it easier to apply the LAMSTAR neural network to practical applications. I also moved the Case Study 6.D to become Case Study 4.A, since it is essentially a Perceptron solution.

I consider the Case Studies important to a reader who wishes to see a concrete application of the neural networks considered in the text, including a complete source code for that particular application with explanations on organizing that application. Therefore, I replaced some of the older Case Studies with new ones with more detail and using most current coding languages (MATLAB, Java, C++). To allow better comparison between the various neural network architectures regarding performance, robustness and programming effort, all Chapters dealing with major networks have a Case Study that solves the same problem, namely character recognition. Consequently, Case Studies 5.A (previously 4.A, since the order of these chapters is interchanged), 6.A (previously 6.C), 7.A and 8.A have all been replaced with new and more detailed Case Studies, all on character recognition in a 6×6 grid. Case Studies on the same problem have been added to Chapters 9, 12 and 13 as Case Studies 9.A, 12.A and 13.A (the old Case Studies 9.A and 13.A now became 9.B and 13.B). Also, a Case Study 7.B on applying the Hopfield Network to the well-known Traveling Salesman Problem (TSP) was added to Chapter 7. Other Case Studies remained as in the First Edition.
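By way of illustration, here is a minimal C++ sketch (illustrative only, not taken from the book's case studies; the bitmap, names and values are hypothetical) of the input setup shared by these character-recognition Case Studies: a character on a 6×6 grid is encoded as a bipolar 36-element vector and fed to a single hard-limiter perceptron unit.

// Minimal illustrative sketch: encode a 6x6 character bitmap as a
// bipolar (+1/-1) input vector and pass it through one hard-limiter
// perceptron unit. All names and values are hypothetical.
#include <array>
#include <iostream>

constexpr int GRID = 6;
using Pattern  = std::array<std::array<int, GRID>, GRID>;  // 1 = dark pixel
using InputVec = std::array<double, GRID * GRID>;

// Flatten a 6x6 bitmap into a 36-element +1/-1 input vector.
InputVec toBipolar(const Pattern& p) {
    InputVec x{};
    for (int r = 0; r < GRID; ++r)
        for (int c = 0; c < GRID; ++c)
            x[r * GRID + c] = p[r][c] ? 1.0 : -1.0;
    return x;
}

// Hard-limiter perceptron: sign of bias plus weighted sum of inputs.
int perceptronOutput(const InputVec& x, const InputVec& w, double bias) {
    double s = bias;
    for (std::size_t i = 0; i < x.size(); ++i) s += w[i] * x[i];
    return s >= 0.0 ? 1 : -1;
}

int main() {
    // Hypothetical 6x6 bitmap of the letter "T".
    Pattern t = {{{1, 1, 1, 1, 1, 1},
                  {0, 0, 1, 1, 0, 0},
                  {0, 0, 1, 1, 0, 0},
                  {0, 0, 1, 1, 0, 0},
                  {0, 0, 1, 1, 0, 0},
                  {0, 0, 1, 1, 0, 0}}};
    InputVec x = toBipolar(t);
    InputVec w{};  // untrained weights, zero-initialized for the sketch
    std::cout << perceptronOutput(x, w, 0.0) << '\n';  // prints 1 (sum is 0)
    return 0;
}

Training such a unit then amounts to adjusting the 36 weights and the bias from labeled examples, along the lines developed in Chapters 4 and 5.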

I hope that these updates will add to the readers’ ability to better understand what Neural Networks can do, how they are applied and what the differences are between the different major architectures. I feel that this, and the case studies with their source codes and the respective code-design details, will help to fill a gap in the literature available to a graduate student, or to an advanced undergraduate senior, who is interested in studying artificial neural networks or in applying them.

Above all, the text should enable the reader to grasp the very broad range of problems to which neural networks are applicable, especially those that defy analysis and/or are very complex, such as in medicine or finance. It (and its Case Studies) should also help the reader to understand that this is both doable and rather easily programmable and executable.

Daniel Graupe
Chicago, IL

    September 2006



    Preface to the First Edition

This book evolved from the lecture notes of a first-year graduate course entitled “Neural Networks” which I taught at the Department of Electrical Engineering and Computer Science of the University of Illinois at Chicago over the years 1990–1996. Whereas that course was a first-year graduate course, several senior-year undergraduate students from different engineering departments attended it with little difficulty. It was mainly for historical and scheduling reasons that the course was a graduate course, since no such course existed in our program of studies or in the curricula of most U.S. universities at the senior-year undergraduate level. I therefore consider this book, which closely follows these lecture notes, to be suitable for such undergraduate students. Furthermore, it should be applicable to students at that level from essentially every science and engineering university department. Its prerequisites are the mathematical fundamentals in terms of some linear algebra and calculus, and computational programming skills (not limited to a particular programming language) that all such students possess.

Indeed, I strongly believe that Neural Networks are a field of both intellectual interest and practical value to all such students and young professionals. Artificial neural networks not only provide an understanding of an important computational architecture and methodology, but they also provide an understanding (very simplified, of course) of the mechanism of the biological neural network.

Neural networks were until recently considered a “toy” by many computer engineers and business executives. This was probably somewhat justified in the past, since neural nets could at best apply to small memories that were analyzable just as successfully by other computational tools. I believe (and I tried in the later chapters below to give some demonstration to support this belief) that neural networks are indeed a valid, and presently the only efficient, tool to deal with very large memories.

The beauty of such nets is that they can allow, and will in the near future allow, for instance, a computer user to overcome slight errors in representation or in programming (missing a trivial but essential command such as a period or any other symbol or character) and yet have the computer execute the command. This will obviously require a neural network buffer between the keyboard and the main programs. It should allow browsing through the Internet with both fun and efficiency. Advances in VLSI realizations of neural networks should allow in the coming years many concrete applications in control, communications and medical devices, including in artificial limbs and organs and in neural prostheses, such as neuromuscular stimulation aids in certain paralysis situations.

For me as a teacher, it was remarkable to see how students with no background in signal processing or pattern recognition could easily, a few weeks (10–15 hours) into the course, solve speech recognition, character identification and parameter estimation problems as in the case studies included in the text. Such computational capabilities make it clear to me that the merit of the neural network tool is huge. In any other class, students might need to spend many more hours performing such tasks, and would spend much more computing time. Note that my students used only PCs for these tasks (for simulating all the networks concerned). Since the building blocks of neural nets are so simple, this becomes possible. And this simplicity is the main feature of neural networks: a housefly does not, to the best of my knowledge, use advanced calculus to recognize a pattern (food, danger), nor does its CNS computer work in picosecond-cycle times. Researchers in neural networks try, therefore, to find out why this is so. This leads, and has led, to neural network theory and development, and it is the guiding light to be followed in this exciting field.

Daniel Graupe
Chicago, IL
January 1997



    Contents

    Acknowledgments vii

    Preface to the Third Edition ix

    Preface to the Second Edition xi

    Preface to the First Edition xiii

Chapter 1. Introduction and Role of Artificial Neural Networks 1

    Chapter 2. Fundamentals of Biological Neural Networks 5

Chapter 3. Basic Principles of ANNs and Their Early Structures 9
    3.1. Basic Principles of ANN Design 9
    3.2. Basic Network Structures 10
    3.3. The Perceptron’s Input-Output Principles 11
    3.4. The Adaline (ALC) 13

Chapter 4. The Perceptron 17
    4.1. The Basic Structure 17
    4.2. The Single-Layer Representation Problem 22
    4.3. The Limitations of the Single-Layer Perceptron 22
    4.4. Many-Layer Perceptrons 24
    4.A. Perceptron Case Study: Identifying Autoregressive Parameters of a Signal (AR Time Series Identification) 25

Chapter 5. The Madaline 37
    5.1. Madaline Training 37
    5.A. Madaline Case Study: Character Recognition 39





Chapter 6. Back Propagation 59
    6.1. The Back Propagation Learning Procedure 59
    6.2. Derivation of the BP Algorithm 59
    6.3. Modified BP Algorithms 63
    6.A. Back Propagation Case Study: Character Recognition 65
    6.B. Back Propagation Case Study: The Exclusive-OR (XOR) Problem (2-Layer BP) 76
    6.C. Back Propagation Case Study: The XOR Problem — 3-Layer BP Network 94
    6.D. Average Monthly High and Low Temperature Prediction Using Backpropagation Neural Networks 112

Chapter 7. Hopfield Networks 123
    7.1. Introduction 123
    7.2. Binary Hopfield Networks 123
    7.3. Setting of Weights in Hopfield Nets — Bidirectional Associative Memory (BAM) Principle 125
    7.4. Walsh Functions 127
    7.5. Network Stability 129
    7.6. Summary of the Procedure for Implementing the Hopfield Network 131
    7.7. Continuous Hopfield Models 132
    7.8. The Continuous Energy (Lyapunov) Function 133
    7.A. Hopfield Network Case Study: Character Recognition 135
    7.B. Hopfield Network Case Study: Traveling Salesman Problem 147
    7.C. Cell Shape Detection Using Neural Networks 170

Chapter 8. Counter Propagation 185
    8.1. Introduction 185
    8.2. Kohonen Self-Organizing Map (SOM) Layer 186
    8.3. Grossberg Layer 186
    8.4. Training of the Kohonen Layer 187
    8.5. Training of Grossberg Layers 189
    8.6. The Combined Counter Propagation Network 190
    8.A. Counter Propagation Network Case Study: Character Recognition 190

Chapter 9. Large Scale Memory Storage and Retrieval (LAMSTAR) Network 203
    9.1. Motivation 203
    9.2. Basic Principles of the LAMSTAR Neural Network 204




    9.3. Detailed Outline of the LAMSTAR Network 205
    9.4. Forgetting Feature 211
    9.5. Training vs. Operational Runs 213
    9.6. Operation in Face of Missing Data 213
    9.7. Advanced Data Analysis Capabilities 214
    9.8. Modified Version: Normalized Weights 217
    9.9. Concluding Comments and Discussion of Applicability 218
    9.A. LAMSTAR Network Case Study: Character Recognition 220
    9.B. Application to Medical Diagnosis Problems 236
    9.C. Predicting Price Movement in Market Microstructure via LAMSTAR 240
    9.D. Constellation Recognition 253

Chapter 10. Adaptive Resonance Theory 275
    10.1. Motivation 275
    10.2. The ART Network Structure 275
    10.3. Setting-Up of the ART Network 279
    10.4. Network Operation 280
    10.5. Properties of ART 281
    10.6. Discussion and General Comments on ART-I and ART-II 283
    10.A. ART-I Network Case Study: Character Recognition 283
    10.B. ART-I Case Study: Speech Recognition 297

Chapter 11. The Cognitron and the Neocognitron 305
    11.1. Background of the Cognitron 305
    11.2. The Basic Principles of the Cognitron 305
    11.3. Network Operation 305
    11.4. Cognitron’s Network Training 307
    11.5. The Neocognitron 309

Chapter 12. Statistical Training 311
    12.1. Fundamental Philosophy 311
    12.2. Annealing Methods 312
    12.3. Simulated Annealing by Boltzmann Training of Weights 312
    12.4. Stochastic Determination of Magnitude of Weight Change 313
    12.5. Temperature-Equivalent Setting 313
    12.6. Cauchy Training of Neural Network 314
    12.A. Statistical Training Case Study: A Stochastic Hopfield Network for Character Recognition 315




    12.B. Statistical Training Case Study: Identifying AR Signal Parameters with a Stochastic Perceptron Model 318

Chapter 13. Recurrent (Time Cycling) Back Propagation Networks 327
    13.1. Recurrent/Discrete Time Networks 327
    13.2. Fully Recurrent Networks 328
    13.3. Continuously Recurrent Back Propagation Networks 330
    13.A. Recurrent Back Propagation Case Study: Character Recognition 330

    Problems 343

    References 349

    Author Index 357

    Subject Index 361
