Neural Network Toolbox
newp

Create a perceptron

Syntax

net = newp(pr,s,tf,lf)

Description

Perceptrons are used to solve simple (i.e. linearly separable) classification problems.

net = newp(PR,S,TF,LF) takes these inputs,

  PR - R x 2 matrix of min and max values for R input elements.
  S  - Number of neurons.
  TF - Transfer function, default = 'hardlim'.
  LF - Learning function, default = 'learnp'.

and returns a new perceptron.

The transfer function TF can be hardlim or hardlims. The learning function LF can be learnp or learnpn.

Call newp without input arguments to define the network's attributes in a dialog window.

Properties

Perceptrons consist of a single layer with the dotprod weight function, the netsum net input function, and the specified transfer function.

The layer has a weight from the input and a bias.

Weights and biases are initialized with initzero.

Adaptation and training are done with adaptwb and trainwb, which both update weight and bias values with the specified learning function. Performance is measured with mae.

Examples

This code creates a perceptron layer with one 2-element input (ranges [0 1] and [-2 2]) and one neuron. (Supplying only two arguments to newp results in the default perceptron learning function learnp being used.)
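A sketch of that call, with argument values taken from the description above:

```matlab
% Perceptron with one 2-element input and one neuron.
% Each row of the first argument gives the [min max] range of one
% input element; with only two arguments, the default transfer
% function (hardlim) and learning function (learnp) are used.
net = newp([0 1; -2 2],1);
```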

Here we simulate the network's response to a sequence of inputs P.
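For instance (the particular input values are illustrative):

```matlab
% A sequence of four 2-element inputs, presented one at a time.
P = {[0; 0] [0; 1] [1; 0] [1; 1]};
% Simulate the network's response to the sequence.
Y = sim(net,P)
```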

Here we define a sequence of targets T (together P and T define the operation of an AND gate), and then let the network adapt for 10 passes through the sequence. We then simulate the updated network.
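This step might look as follows, assuming the toolbox's standard adapt interface and the sequence P above:

```matlab
% Targets pairing each input with the output of an AND gate.
T = {0 0 0 1};
% Let the network adapt for 10 passes through the sequence.
net.adaptParam.passes = 10;
[net,Y,E] = adapt(net,P,T);
% Simulate the updated network.
Y = sim(net,P)
```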

Now we define a new problem, an OR gate, with batch inputs P and targets T.
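In batch form the inputs and targets are matrices, with one column per vector. A sketch for the OR-gate problem:

```matlab
% Batch inputs: each column is one 2-element input vector.
P = [0 0 1 1; 0 1 0 1];
% OR-gate targets: output is 1 if either input is 1.
T = [0 1 1 1];
```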

Here we initialize the perceptron (resulting in new random weight and bias values), simulate its output, train for a maximum of 20 epochs, and then simulate it again.
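A sketch of this sequence, assuming the standard init and train interfaces:

```matlab
% Reinitialize the perceptron and check its untrained output.
net = init(net);
Y = sim(net,P)
% Train for at most 20 epochs, then simulate the trained network.
net.trainParam.epochs = 20;
net = train(net,P,T);
Y = sim(net,P)
```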

Notes

Perceptrons can classify linearly separable classes in a finite number of training steps. If input vectors have a large variance in their lengths, learnpn can be faster than learnp.

See Also

sim, init, adapt, train, hardlim, hardlims, learnp, learnpn


