mandist
Manhattan distance weight function
Z = mandist(W,P)
df = mandist('deriv')
D = mandist(pos)
mandist is the Manhattan distance weight function. Weight functions apply weights to an input to get weighted inputs.
mandist(W,P) takes these inputs,
W - S x R weight matrix.
P - R x Q matrix of Q input (column) vectors.
and returns the S x Q matrix of vector distances.
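For example, here is a quick check of that definition (the sizes below are chosen purely for illustration): element (i,j) of Z is the Manhattan distance between row i of W and column j of P.
W = rand(4,3);                     % S = 4 rows of weights, R = 3 elements each
P = rand(3,2);                     % Q = 2 input column vectors
Z = mandist(W,P);                  % 4 x 2 matrix of distances
Z11 = sum(abs(W(1,:)' - P(:,1)))   % matches Z(1,1)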
mandist('deriv') returns '' because mandist does not have a derivative function.
mandist is also a layer distance function, which can be used to find the distances between neurons in a layer.
mandist(pos) takes one argument,
pos - N x S matrix of S neuron positions,
and returns the S x S matrix of distances.
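For instance, with three neurons placed on a line at x = 0, 1, and 2 (positions chosen here purely for illustration), the distance matrix is:
pos = [0 1 2; 0 0 0];     % each column is one neuron's (x,y) position
D = mandist(pos)          % D(i,j) = Manhattan distance between neurons i and j
% D =
%      0     1     2
%      1     0     1
%      2     1     0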
Here we define a random weight matrix W and input vector P and calculate the corresponding weighted input Z.
W = rand(4,3); P = rand(3,1); Z = mandist(W,P)
Here we define a random matrix of positions for 10 neurons arranged in three-dimensional space and then find their distances.
pos = rand(3,10); D = mandist(pos)
You can create a standard network that uses mandist as a distance function by calling newsom.
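As a hedged sketch (the newsom argument order shown here, with the distance function passed as the fourth argument, follows the older toolbox calling form and is an assumption, not taken from this page):
P = rand(2,100);                                    % 2-D training data
net = newsom(minmax(P),[4 4],'hextop','mandist');   % 4x4 SOM whose topology uses mandist
net.layers{1}.distanceFcn                           % shows 'mandist'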
To change a network so an input weight uses mandist, set net.inputWeights{i,j}.weightFcn to 'mandist'. For a layer weight, set net.layerWeights{i,j}.weightFcn to 'mandist'.
To change a network so a layer's topology uses mandist, set net.layers{i}.distanceFcn to 'mandist'.
In either case, call sim to simulate the network with mandist. See newpnn or newgrnn for simulation examples.
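For instance, a minimal sketch (the network, weight indices, and data below are illustrative assumptions, not part of this reference page):
net = newff([0 1; 0 1],[3 1]);                 % any two-layer network will do
net.inputWeights{1,1}.weightFcn = 'mandist';   % input weight from input 1 to layer 1
net.layerWeights{2,1}.weightFcn = 'mandist';   % layer weight from layer 1 to layer 2
net.layers{1}.distanceFcn = 'mandist';         % layer 1 neuron topology distances
Y = sim(net,rand(2,5))                         % simulate with the new settings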
The Manhattan distance D between two vectors X and Y is:
D = sum(abs(X-Y))
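As a small sanity check (the vectors here are arbitrary), the formula agrees with mandist itself when X is passed as a single-row weight matrix:
X = [1; 2; 3];
Y = [4; 0; 3];
D = sum(abs(X-Y))    % 3 + 2 + 0 = 5
mandist(X',Y)        % also returns 5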
See also sim, dist, linkdist.