NeuralNetworkMATLABtoolbox
8/6/2019 NeuralNetworkMATLABtoolbox
April 2007
NEURAL NETWORK TOOLBOX
The MATLAB Neural Network Toolbox provides a complete set of functions and a graphical user interface for the design, implementation, visualization, and simulation of neural networks. It supports the most commonly used supervised and unsupervised network architectures and a comprehensive set of training and learning functions.
KEY FEATURES
1. Graphical user interface (GUI) for creating, training, and simulating your neural networks.
2. Support for the most commonly used supervised and unsupervised network architectures.
3. A comprehensive set of training and learning functions.
4. A suite of Simulink blocks, as well as documentation and demonstrations of control system applications.
5. Automatic generation of Simulink models from neural network objects.
6. Routines for improving generalization.
GENERAL CREATION OF NETWORK

net = network
net = network(numInputs,numLayers,biasConnect,inputConnect,layerConnect,outputConnect,targetConnect)

Description
NETWORK creates new custom networks. It is used to create networks that are then customized by functions such as NEWP, NEWLIN, NEWFF, etc.
NETWORK takes these optional arguments (shown with default values):
numInputs - Number of inputs, 0.
numLayers - Number of layers, 0.
biasConnect - numLayers-by-1 Boolean vector, zeros.
inputConnect - numLayers-by-numInputs Boolean matrix, zeros.
layerConnect - numLayers-by-numLayers Boolean matrix, zeros.
outputConnect - 1-by-numLayers Boolean vector, zeros.
targetConnect - 1-by-numLayers Boolean vector, zeros.
and returns
NET - New network with the given property values.
TRAIN AND ADAPT
1. Incremental training: updating the weights after the presentation of each single training sample.
2. Batch training: updating the weights after presenting the complete data set.
When using adapt, both incremental and batch training can be used. When using train, on the other hand, only batch training will be used, regardless of the format of the data. The big plus of train is that it gives you a lot more choice in training functions (gradient descent, gradient descent with momentum, Levenberg-Marquardt, etc.), which are implemented very efficiently.
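A minimal batch-training sketch with train (older toolbox syntax; the XOR data and layer sizes are illustrative assumptions, not from the slides):

```matlab
P = [0 0 1 1; 0 1 0 1];                        % inputs, one column per sample
T = [0 1 1 0];                                 % targets (XOR)
net = newff(minmax(P), [4 1], ...              % 4 hidden neurons, 1 output
            {'tansig','purelin'}, 'trainlm');  % Levenberg-Marquardt
net = train(net, P, T);                        % batch training
Y = sim(net, P);                               % simulate the trained network
```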
The difference between train and adapt: the difference between passes and epochs. When using adapt, the property that determines how many times the complete training data set is used for training the network is called net.adaptParam.passes. Fair enough. But when using train, the exact same property is now called net.trainParam.epochs.
>> net.trainFcn = 'traingdm';
>> net.trainParam.epochs = 1000;
>> net.adaptFcn = 'adaptwb';
>> net.adaptParam.passes = 10;
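An end-to-end sketch that mirrors the adapt settings above (older toolbox syntax; the linear network and its data are hypothetical):

```matlab
P = {[1;2] [2;1] [2;3] [3;1]};   % cell array of samples => incremental updates
T = {4 5 7 7};                   % hypothetical targets
net = newlin([0 3; 0 3], 1);     % linear network with one output
net.adaptParam.passes = 10;      % whole data set presented 10 times
[net, Y, E] = adapt(net, P, T);  % weights updated after each sample
```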
TRAINING FUNCTIONS
There are several types of training functions:
1. Supported training functions,
2. Supported learning functions,
3. Transfer functions,
4. Transfer derivative functions,
5. Weight and bias initialization functions,
6. Weight derivative functions.
SUPPORTED LEARNING FUNCTIONS
learncon - Conscience bias learning function.
learngd - Gradient descent weight/bias learning function.
learngdm - Gradient descent with momentum weight/bias learning function.
learnh - Hebb weight learning function.
learnhd - Hebb with decay weight learning rule.
learnis - Instar weight learning function.
learnk - Kohonen weight learning function.
learnlv1 - LVQ1 weight learning function.
learnlv2 - LVQ2 weight learning function.
learnos - Outstar weight learning function.
learnp - Perceptron weight and bias learning function.
learnpn - Normalized perceptron weight and bias learning function.
learnsom - Self-organizing map weight learning function.
learnwh - Widrow-Hoff weight and bias learning rule.
TRANSFER FUNCTIONS
compet - Competitive transfer function.
hardlim - Hard limit transfer function.
hardlims - Symmetric hard limit transfer function.
logsig - Log sigmoid transfer function.
poslin - Positive linear transfer function.
purelin - Linear transfer function.
radbas - Radial basis transfer function.
satlin - Saturating linear transfer function.
satlins - Symmetric saturating linear transfer function.
softmax - Soft max transfer function.
tansig - Hyperbolic tangent sigmoid transfer function.
tribas - Triangular basis transfer function.
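These transfer functions can be called directly on a vector of net inputs; a quick comparison sketch (the sample range is illustrative):

```matlab
n  = -5:0.1:5;          % sample net inputs
a1 = logsig(n);         % squashes to (0,1)
a2 = tansig(n);         % squashes to (-1,1)
a3 = purelin(n);        % identity
plot(n, a1, n, a2, n, a3)
legend('logsig', 'tansig', 'purelin')
```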
TRANSFER DERIVATIVE FUNCTIONS
dhardlim - Hard limit transfer derivative function.
dhardlms - Symmetric hard limit transfer derivative function.
dlogsig - Log sigmoid transfer derivative function.
dposlin - Positive linear transfer derivative function.
dpurelin - Linear transfer derivative function.
dradbas - Radial basis transfer derivative function.
dsatlin - Saturating linear transfer derivative function.
dsatlins - Symmetric saturating linear transfer derivative function.
dtansig - Hyperbolic tangent sigmoid transfer derivative function.
dtribas - Triangular basis transfer derivative function.
WEIGHT AND BIAS INITIALIZATION FUNCTIONS
initcon - Conscience bias initialization function.
initzero - Zero weight/bias initialization function.
midpoint - Midpoint weight initialization function.
randnc - Normalized column weight initialization function.
randnr - Normalized row weight initialization function.
rands - Symmetric random weight/bias initialization function.

WEIGHT DERIVATIVE FUNCTIONS
ddotprod - Dot product weight derivative function.
NEURAL NETWORK TOOLBOX GUI
1. The graphical user interface (GUI) is designed to be simple and user friendly. This tool lets you import potentially large and complex data sets.
2. The GUI also enables you to create, initialize, train, simulate, and manage the networks. It has the GUI Network/Data Manager window.
3. The window has its own work area, separate from the more familiar command line workspace. Thus, when using the GUI, one might "export" the GUI results to the (command line) workspace. Similarly, one might "import" results from the command line workspace to the GUI.
4. Once the Network/Data Manager is up and running, you can create a network, view it, train it, simulate it, and export the final results to the workspace. Similarly, you can import data from the workspace for use in the GUI.
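In the toolbox versions of this era, the Network/Data Manager window is opened from the command line:

```matlab
nntool   % opens the GUI Network/Data Manager window
```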
A graphical user interface can thus be used to:
1. Create networks,
2. Create data,
3. Train the networks,
4. Export the networks,
5. Export the data to the command line workspace.
CONCLUSION
This presentation has given an overview of the Neural Network Toolbox in MATLAB.