initnw
Nguyen-Widrow layer initialization function
Syntax
net = initnw(net,i)
Description
initnw is a layer initialization function that initializes a layer's weights and biases according to the Nguyen-Widrow initialization algorithm. This algorithm chooses values in order to distribute the active region of each neuron in the layer evenly across the layer's input space.

initnw(net,i) takes two arguments,

net - Neural network.
i - Index of a layer.

and returns the network with layer i's weights and biases updated.
Network Use
You can create a standard network that uses initnw by calling newff or newcf.

To prepare a custom network to be initialized with initnw:
1. Set net.initFcn to 'initlay'. (This will set net.initParam to the empty matrix [ ], since initlay has no initialization parameters.)
2. Set net.layers{i}.initFcn to 'initnw'.
To initialize the network, call init. See newff and newcf for training examples.
Algorithm
The Nguyen-Widrow method generates initial weight and bias values for a layer, so that the active regions of the layer's neurons will be distributed roughly evenly over the input space.
Advantages over purely random weights and biases are:

1. Few neurons are wasted (since all the neurons are in the input space).
2. Training works faster (since each area of the input space has neurons).

The Nguyen-Widrow method can only be applied to layers...
If these conditions are not met, then initnw uses rands to initialize the layer's weights and biases.
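The toolbox's internal implementation is not shown here, but the published Nguyen-Widrow scheme described above can be sketched as follows. This is a minimal illustration in Python, not the toolbox code; the function name nguyen_widrow and the assumption that layer inputs are normalized to the interval [-1, 1] are ours.

```python
import numpy as np

def nguyen_widrow(n_inputs, n_neurons, rng=None):
    """Sketch of Nguyen-Widrow initialization for one layer.

    Assumes layer inputs lie in [-1, 1] and the transfer function
    has a finite active region (e.g. a tansig-like sigmoid).
    Returns (weights, biases) with weights of shape
    (n_neurons, n_inputs) and biases of shape (n_neurons,).
    """
    rng = np.random.default_rng() if rng is None else rng

    # Scale factor that spreads the neurons' active regions
    # roughly evenly over the input space.
    beta = 0.7 * n_neurons ** (1.0 / n_inputs)

    # Random direction for each neuron's weight vector,
    # rescaled so every row has norm beta.
    w = rng.uniform(-1.0, 1.0, size=(n_neurons, n_inputs))
    w *= beta / np.linalg.norm(w, axis=1, keepdims=True)

    # Biases evenly spaced across [-beta, beta], with signs matched
    # to the first weight column so the active regions tile the
    # input space rather than stacking on top of each other.
    if n_neurons == 1:
        b = np.zeros(1)
    else:
        b = beta * np.linspace(-1.0, 1.0, n_neurons) * np.sign(w[:, 0])
    return w, b
```

With purely random initialization, several neurons' active regions can overlap or fall outside the input region; constraining the weight norms and spacing the biases is what gives the even coverage described in the Algorithm section.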
See Also
initwb, initlay, init