Given the following data:
| X1 | X2 | Class |
|---|---|---|
| 2 | 2 | 1 |
| 3 | 3 | 1 |
| 2 | 0 | 1 |
| 3 | 1 | 1 |
| 1 | 1 | 1 |
| 0 | 2 | 2 |
| -1 | 2 | 2 |
| -2 | 0 | 2 |
| -1 | -1 | 2 |
| -2 | 1 | 2 |
1. Using the Perceptron Rule with ρ_k = 1/k and an initial weight vector w = [0, 3, 2]^T, find a linear decision function for these classes. At each step, diagram the discriminant function against the data.
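The perceptron iteration above can be sketched as a short script. This is a minimal sketch, not a prescribed solution: it assumes augmented samples y = [1, x1, x2] with class-2 samples negated (so one test w·y > 0 covers both classes), cycling through the samples in table order, and counting k only on corrections. Under those assumptions this data happens to need just one correction from the given starting vector.

```python
import numpy as np

# Data from the table; augment each sample as y = [1, x1, x2] and negate
# class-2 samples so a single test w.y > 0 covers both classes.
X1 = np.array([[2, 2], [3, 3], [2, 0], [3, 1], [1, 1]], dtype=float)
X2 = np.array([[0, 2], [-1, 2], [-2, 0], [-1, -1], [-2, 1]], dtype=float)

def augment(X, negate=False):
    Y = np.hstack([np.ones((len(X), 1)), X])
    return -Y if negate else Y

Y = np.vstack([augment(X1), augment(X2, negate=True)])

w = np.array([0.0, 3.0, 2.0])  # initial weight vector from the problem
k = 0                          # correction counter; rho_k = 1/k (assumed convention)
converged = False
while not converged:
    converged = True
    for y in Y:
        if w @ y <= 0:                # misclassified (or on the boundary)
            k += 1
            w = w + (1.0 / k) * y     # perceptron correction with rho_k = 1/k
            converged = False

print("final w:", w)
```

To produce the required diagrams, record `w` after each correction and plot the line w0 + w1·x1 + w2·x2 = 0 over the scatter of the two classes.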
2. For the same data and initial weight vector, use the LMS rule to find the discriminant function, and diagram the intermediate weight vectors. For ρ_k, use 1/(k+1); note that this choice may cause the iteration to diverge, so be prepared to adjust ρ. You should probably write a small program or script to try different values of ρ.
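Following the problem's own suggestion to script this, here is a minimal LMS (Widrow-Hoff) sketch under the same augmented, sign-normalized setup. The target margin b = 1, the sample ordering, and starting the step index k at 0 are assumptions; passing ρ as a function makes it easy to try other schedules when 1/(k+1) behaves badly.

```python
import numpy as np

X1 = np.array([[2, 2], [3, 3], [2, 0], [3, 1], [1, 1]], dtype=float)
X2 = np.array([[0, 2], [-1, 2], [-2, 0], [-1, -1], [-2, 1]], dtype=float)

def augment(X, negate=False):
    Y = np.hstack([np.ones((len(X), 1)), X])
    return -Y if negate else Y

Y = np.vstack([augment(X1), augment(X2, negate=True)])
b = 1.0  # target margin for w.y (assumed; use your course's convention)

def lms(rho, n_passes=200):
    """Sequential Widrow-Hoff updates; rho maps the step index k to rho_k."""
    w = np.array([0.0, 3.0, 2.0])   # initial weight vector from the problem
    history = [w.copy()]            # intermediate vectors, for diagramming
    k = 0
    for _ in range(n_passes):
        for y in Y:
            w = w + rho(k) * (b - w @ y) * y   # LMS correction toward margin b
            k += 1
            history.append(w.copy())
    return w, history

w, history = lms(lambda k: 1.0 / (k + 1))   # rho_k = 1/(k+1), k = 0, 1, 2, ...
print("LMS w:", w)
```

The early steps overshoot badly because ρ_0 = 1 is large relative to the sample norms, which is exactly the divergence risk the problem warns about; trying, say, `lms(lambda k: 0.01)` shows much tamer behavior. Plot entries of `history` to diagram the intermediate vectors.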