Neural Network Toolbox

srchcha
One-dimensional minimization using the method of Charalambous
Syntax

[a,gX,perf,retcode,delta,tol] = srchcha(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf)

Description

srchcha is a linear search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique based on the method of Charalambous.
srchcha(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf) takes these inputs,

net - Neural network.
X - Vector containing current values of weights and biases.
Pd - Delayed input vectors.
Tl - Layer target vectors.
Ai - Initial input delay conditions.
Q - Batch size.
TS - Time steps.
dX - Search direction vector.
gX - Gradient vector.
perf - Performance value at current X.
dperf - Slope of performance value at current X in direction of dX.
delta - Initial step size.
tol - Tolerance on search.
ch_perf - Change in performance on previous step.
and returns,

a - Step size which minimizes performance.
gX - Gradient at new minimum point.
perf - Performance value at new minimum point.
retcode - Return code, which has three elements. The first two elements correspond to the number of function evaluations in the two stages of the search. The third element is a return code. These have different meanings for different search algorithms, and some may not be used in this function.
delta - New initial step size, based on the current step size.
tol - New tolerance on search.
Parameters used for the srchcha algorithm are:

alpha - Scale factor which determines sufficient reduction in perf.
beta - Scale factor which determines sufficiently large step size.
gama - Parameter to avoid small reductions in performance. Usually set to 0.1.
scale_tol - Parameter which relates the tolerance tol to the initial step size delta. Usually set to 20.
The defaults for these parameters are set in the training function that calls srchcha. See traincgf, traincgb, traincgp, trainbfg, trainoss.
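As a minimal sketch of how these parameters can be adjusted (assuming a network whose training function calls srchcha, such as traincgf; the values shown are only illustrative):

```matlab
net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
net.trainParam.searchFcn = 'srchcha';  % use the Charalambous line search
net.trainParam.alpha = 0.001;          % sufficient-reduction scale factor
net.trainParam.beta = 0.1;             % sufficient-step-size scale factor
net.trainParam.gama = 0.1;             % guards against small reductions in performance
net.trainParam.scale_tol = 20;         % relates tol to the initial step size delta
```

Any parameter left unset keeps the default assigned when net.trainFcn was set.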
Dimensions for these variables are:

Pd - No x Ni x TS cell array, each element Pd{i,j,ts} is a Dij x Q matrix.
Tl - Nl x TS cell array, each element Tl{i,ts} is a Vi x Q matrix.
Ai - Nl x LD cell array, each element Ai{i,k} is an Si x Q matrix.

where

Dij = Ri * length(net.inputWeights{i,j}.delays)
Examples

Here is a problem consisting of inputs p and targets t that we would like to solve with a network.

p = [0 1 2 3 4 5];
t = [0 0 0 1 1 1];

Here a two-layer feed-forward network is created. The network's input ranges from [0 to 5]. The first layer has two tansig neurons, and the second layer has one logsig neuron. The traincgf network training function and the srchcha search function are to be used.
Create and Test a Network

net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
a = sim(net,p)

Train and Retest the Network

net.trainParam.searchFcn = 'srchcha';
net.trainParam.epochs = 50;
net.trainParam.show = 10;
net.trainParam.goal = 0.1;
net = train(net,p,t);
a = sim(net,p)
Network Use

You can create a standard network that uses srchcha with newff, newcf, or newelm.
To prepare a custom network to be trained with traincgf, using the line search function srchcha:

1. Set net.trainFcn to 'traincgf'. This will set net.trainParam to traincgf's default parameters.
2. Set net.trainParam.searchFcn to 'srchcha'.
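These two steps can be sketched as follows (assuming net is an existing custom network object):

```matlab
% Step 1: select the conjugate gradient training function;
% this also loads traincgf's default parameters into net.trainParam.
net.trainFcn = 'traincgf';

% Step 2: select the Charalambous line search.
net.trainParam.searchFcn = 'srchcha';
```

Calling train(net,p,t) will then train the network with traincgf, which invokes srchcha for each line search.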
The line search function srchcha can be used with any of the following training functions: traincgf, traincgb, traincgp, trainbfg, trainoss.
Algorithm

srchcha locates the minimum of the performance function in the search direction dX, using an algorithm based on the method described in Charalambous (IEEE Proceedings, vol. 139, no. 3, June 1992).
See Also

srchbac, srchbre, srchgol, srchhyb
References

Charalambous, C., "Conjugate gradient algorithm for efficient training of artificial neural networks," IEEE Proceedings, vol. 139, no. 3, pp. 301-310, 1992.