Ch. Introduction - Islamic University of Gaza (site.iugaza.edu.ps/mhanjouri/files/2013/02/Ch1.pdf)
Ch. 1: Introduction
Islamic University of Gaza
Computer Engineering Department
Graduate studies
ECOM 6343: Neural Networks Associate Professor: Mohammed Alhanjouri
This lecture is prepared from our textbook- ch1
Syllabus
Course Description:
Fundamental concepts of neural computing and terminology;
main neural network architectures: single/multilayer
perceptrons, feedback (recurrent)/feedforward information
flow; supervised/unsupervised learning models;
backpropagation; self-organizing; adaptive resonance;
auto/heteroassociative neural memory models;
neurocomputing implementation; applications;
performance evaluation.
Prerequisite: ECOM 3312 Data Structures and Algorithms
Syllabus
Instructor: Associate Professor/ Mohammed Alhanjouri
Location: Administrative Building, 2nd floor, Room 226
E-mail: [email protected]
Office hours: Sunday, Tuesday from 11:00 to 12:00, and
one hour before our lecture
Lecture Time: every Wednesday from 14:00 to 17:00
All material will be posted on Moodle
Required Textbook:
M. Hagan, H. Demuth and M. Beale, Neural Network
Design, PWS Publishing Company, 1996.
Other Texts (not required):
1- Neural Networks: A Comprehensive Foundation (2nd edition),
by S. Haykin, Prentice Hall, 1999, 2005.
2- Neural Network Learning: Theoretical Foundations, by M.
Anthony and P. L. Bartlett, Cambridge University Press, 1999, 2009.
3- Neural Networks for Pattern Recognition, by C. Bishop,
Oxford University Press, 1995.
4- An Introduction to Neural Networks, by B. Krose and P. van
der Smagt, Amsterdam University Press, 1996.
5- Neural Networks, by R. Rojas, Springer-Verlag, Berlin, 1996.
Course Objectives
This course will cover basic neural network architectures and
learning algorithms, with applications in pattern recognition,
image processing, and computer vision. Three forms of learning
will be introduced (supervised, unsupervised, and
reinforcement learning) and their applications will be
discussed. Students will have a chance to try out several of
these models on practical problems.
This is an advanced-level course suited for graduate students in
Computer Science and Engineering. It is primarily intended for
highly motivated graduate students who are interested in doing
research in the areas of Neural Networks and Computer Vision.
There are many open problems in these areas suitable for
investigation by Master’s students, leading to a professional
paper or master’s thesis.
Course Outline (tentative)
• Introduction (Chapter 1)
• Neuron Model and Network Architectures (Chapter 2)
• An Illustrative Example (Chapter 3)
• Perceptron Learning Rule (Chapter 4)
• Background on Linear Algebra (Chapters 5 & 6)
• Supervised Hebbian Learning (Chapter 7)
• Background on performance surfaces and optimization (Chapters 8 & 9)
• Widrow-Hoff Learning (Chapter 10)
• Backpropagation (Chapters 11 & 12)
• Associative Learning (Chapter 13)
• Competitive Networks (Chapter 14)
Course Policies
We will spend the first half of the semester with lectures
covering the fundamentals of Neural Networks.
During the second half of the semester, we will
emphasize applications of Neural Networks. For this,
each student is expected to choose a problem of his/her
interest, study several papers related to this problem,
and give a presentation to the rest of the class. The
presentation should be professional, as if it were given at a
formal conference.
All other students are expected to be familiar with the paper
so that they can participate in the class discussion.
Each student is expected to complete a research project.
Projects will be done on an individual basis and no group
projects will be allowed. Ideally, the project will target a
problem relevant to the graduate student’s own research
interests (assuming they are related to Computer Vision
or other related fields).
Course Policies Cont.
Grading
Your grade will be determined based on the midterm and final exams,
the paper presentation, participation in class discussion, and the
project reports and demo. Note that good programming skills and
strong motivation are essential for completing the project
successfully. Programming can be done in any language. A number of
neural network software packages will be available to assist you in
the implementation phase of your project.
Grading Scheme:
- Midterm Exam: 15% (or a few quizzes)
- Final Exam: 25%
- Participation in class discussion: 10%
- Project: 50%
  - Interim report: 5%
  - Final report: 20%
  - Implementation, Experiments, Demo: 15%
  - Presentation: 10%
What Will Not Be Covered
Review of all architectures and learning rules
Implementation
VLSI
Optical
Parallel Computers
Biology (in depth)
Psychology
Introduction
As you read these words you are using a complex biological neural network. You have a highly interconnected set of some 10¹¹ neurons to facilitate your reading, breathing, motion, and thinking.
Historical Sketch
Pre-1940: von Helmholtz, Mach, Pavlov, etc. General theories of learning, vision, and conditioning; no specific mathematical models of neuron operation.
1940s: Hebb, McCulloch and Pitts. Mechanisms for learning in biological neurons; neural-like networks can compute any arithmetic function.
1950s: Rosenblatt, Widrow and Hoff. First practical networks and learning rules.
1960s: Minsky and Papert. Demonstrated limitations of existing neural networks; new learning algorithms were not forthcoming, and some research was suspended.
1970s: Amari, Anderson, Fukushima, Grossberg, Kohonen. Progress continued, although at a slower pace.
1980s: Grossberg, Hopfield, Kohonen, Rumelhart, etc. Important new developments caused a resurgence in the field.
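The 1940s entry above notes that neural-like networks can compute any arithmetic function; the underlying model is the McCulloch-Pitts threshold neuron. A minimal sketch in Python (the function name, weights, and thresholds below are illustrative assumptions, not taken from the textbook):

```python
# McCulloch-Pitts threshold neuron (illustrative sketch): the unit fires
# (outputs 1) when the weighted sum of its binary inputs reaches the threshold.
def mcculloch_pitts(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Simple logic gates, each built from a single neuron:
def AND(a, b):
    # Both inputs must be on to reach the threshold of 2.
    return mcculloch_pitts([a, b], [1, 1], threshold=2)

def OR(a, b):
    # Either input alone reaches the threshold of 1.
    return mcculloch_pitts([a, b], [1, 1], threshold=1)
```

Chaining such threshold units yields arbitrary Boolean (and hence arithmetic) functions, which is the sense of the 1940s result.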
Applications
Aerospace High performance aircraft autopilots, flight path simulations,
aircraft control systems, autopilot enhancements, aircraft component simulations, aircraft component fault detectors
Automotive Automobile automatic guidance systems, warranty activity
analyzers
Banking Check and other document readers, credit application evaluators
Defense Weapon steering, target tracking, object discrimination, facial
recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification
Electronics Code sequence prediction, integrated circuit chip layout, process
control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling
Applications (cont.)
Financial Real estate appraisal, loan advisor, mortgage screening,
corporate bond rating, credit line use analysis, portfolio trading program, corporate financial analysis, currency price prediction
Manufacturing Manufacturing process control, product design and analysis,
process and machine diagnosis, real-time particle identification, visual quality inspection systems, beer testing, welding quality analysis, paper quality prediction, computer chip quality analysis, analysis of grinding operations, chemical product design analysis, machine maintenance analysis, project bidding, planning and management, dynamic modeling of chemical process systems
Medical Breast cancer cell analysis, EEG and ECG analysis, prosthesis
design, optimization of transplant times, hospital expense reduction, hospital quality improvement, emergency room test advisement
Applications (cont.)
Robotics Trajectory control, forklift robot, manipulator controllers, vision
systems
Speech Speech recognition, speech compression, vowel classification, text
to speech synthesis
Securities Market analysis, automatic bond rating, stock trading advisory
systems
Telecommunications Image and data compression, automated information services,
real-time translation of spoken language, customer payment processing systems
Transportation Truck brake diagnosis systems, vehicle scheduling, routing
systems
Biology
• Neurons respond slowly
– 10⁻³ s, compared to 10⁻⁹ s for electrical circuits
• The brain uses massively parallel computation
– 10¹¹ neurons in the brain
– 10⁴ connections per neuron
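Combining the figures above gives a quick back-of-envelope comparison; the arithmetic below is my own sketch using the slide's numbers, not a calculation from the textbook:

```python
# Back-of-envelope: total connections and a rough parallel update rate for
# the brain, using the figures from this slide (10^11 neurons, 10^4
# connections per neuron, ~10^-3 s per neural response).
neurons = 1e11
connections_per_neuron = 1e4
neuron_response_time = 1e-3   # seconds per neural response
circuit_switch_time = 1e-9    # seconds per electronic switch

total_connections = neurons * connections_per_neuron           # ~1e15 synapses
updates_per_second = total_connections / neuron_response_time  # ~1e18 parallel updates/s
speed_ratio = neuron_response_time / circuit_switch_time       # circuits are ~1e6x faster per step
```

Despite each neuron being roughly a million times slower than an electronic gate, the massive parallelism (on the order of 10¹⁵ connections updating together) is what makes biological computation competitive.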