newgrnn
Design a generalized regression neural network
net = newgrnn(P,T,spread)
Generalized regression neural networks (GRNNs) are a kind of radial basis network that is often used for function approximation. GRNNs can be designed very quickly.
newgrnn(P,T,spread) takes three inputs,
  P      - R x Q matrix of Q input vectors.
  T      - S x Q matrix of Q target vectors.
  spread - Spread of radial basis functions, default = 1.0.
and returns a new generalized regression neural network.
The larger the spread, the smoother the function approximation will be. To fit data very closely, use a spread smaller than the typical distance between input vectors. To fit the data more smoothly, use a larger spread.
newgrnn creates a two-layer network. The first layer has radbas neurons, and calculates its weighted inputs with dist and its net input with netprod. The second layer has purelin neurons, and calculates its weighted input with normprod and its net input with netsum. Only the first layer has biases.
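These choices can be checked by inspecting the network object returned by newgrnn (a sketch; the property names below are the standard network object fields, which is an assumption about the toolbox version in use):

  net = newgrnn([1 2 3],[2.0 4.1 5.9]);
  net.inputWeights{1,1}.weightFcn    % 'dist'     - first-layer weight function
  net.layers{1}.netInputFcn          % 'netprod'  - first-layer net input function
  net.layers{1}.transferFcn          % 'radbas'   - first-layer neurons
  net.layerWeights{2,1}.weightFcn    % 'normprod' - second-layer weight function
  net.layers{2}.netInputFcn          % 'netsum'   - second-layer net input function
  net.layers{2}.transferFcn          % 'purelin'  - second-layer neurons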
newgrnn sets the first-layer weights to P', and the first-layer biases are all set to 0.8326/spread, resulting in radial basis functions that cross 0.5 at weighted inputs of +/- spread. The second-layer weights W2 are set to T.
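As a rough check, the network's response can be reproduced directly from these weight and bias settings (a minimal sketch, assuming the standard IW, b, and LW properties of the network object):

  P = [1 2 3];
  T = [2.0 4.1 5.9];
  spread = 1.0;
  net = newgrnn(P,T,spread);
  p  = 1.5;                                 % test input
  n1 = dist(net.IW{1,1},p) .* net.b{1};     % weighted input (dist) times bias (netprod)
  a1 = radbas(n1);                          % first-layer output
  a2 = (net.LW{2,1}*a1) / sum(a1)           % normprod + purelin: should match sim(net,p)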
Examples
Here we design a radial basis network given inputs P and targets T.

  P = [1 2 3];
  T = [2.0 4.1 5.9];
  net = newgrnn(P,T);

Here the network is simulated for a new input.

  P = 1.5;
  Y = sim(net,P)
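The resulting fit can also be examined over a range of inputs (an illustrative extension of the example above, not part of the original documentation):

  P = [1 2 3];
  T = [2.0 4.1 5.9];
  net = newgrnn(P,T);
  Ptest = 0:0.1:4;
  Ytest = sim(net,Ptest);
  plot(P,T,'o',Ptest,Ytest,'-')    % targets versus the network's fitted response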
See Also
sim, newrb, newrbe, newpnn
References
Wasserman, P. D., Advanced Methods in Neural Computing, New York: Van Nostrand Reinhold, pp. 155-161, 1993.