
The Perceptron

Exercises: Part I

Perceptrons are of interest to us because they learn to classify patterns. Learning involves changing the
weights of the connections in the perceptron in such a way as to reduce response errors to (ideally) zero. For
any particular pattern, the response error is simply the desired response to that pattern minus the
perceptron's actual response to it (desired - actual). In general, we are interested in eliminating
response errors to all patterns. For this reason, in this exercise we will examine the effect of changing
the weight vector w on the sum of squared errors (SSE) for eight different patterns.
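In symbols (this notation is ours, not the worksheet's): writing d_p for the desired response and a_p for the perceptron's actual response to pattern p, the quantity tracked in this exercise is

```latex
e_p = d_p - a_p, \qquad \mathrm{SSE} = \sum_{p=1}^{8} e_p^{2} = \sum_{p=1}^{8} \left( d_p - a_p \right)^{2}
```

SSE is zero exactly when the perceptron gives the desired response to all eight patterns.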

1) In this exercise, you will be the "learning rule" for the perceptron. Go to Sheet 1 of the spreadsheet.
Use your mouse to change the position of the endpoint of the weight vector. (If you prefer, go to the
table at the bottom of the spreadsheet and type in different weight coordinates.) Record the weight vector
and the sum of squared errors (SSE) that it produces in the perceptron. Then move the weight vector to a
different position, exploring another part of the pattern space. Use the table below to record your observations:

Pattern   Weight X Value   Weight Y Value   SSE
0         -0.6             0.8              10

[Spreadsheet charts: an SSE readout (10), a "Pattern Space" plot showing the A Patterns and B Patterns with the weight vector WT = (-0.67, 0.89), and an "Output Error For Each Pattern" bar chart.]
                                  ACTUAL   DESIRED            SQUARED
Name     X       Y      NET       OUT      OUT       ERROR    ERROR
A1       0.3     0.7    0.422     1        1         0        0
A2       0.4     0.9    0.533     1        1         0        0
A3       0.5     0.5    0.11      1        1         0        0
A4       0.7     0.3    -0.202    -1       1         2        6
B1       -0.6    0.3    0.669     1        -1        -2       2
B2       -0.4    -0.2   0.09      1        -1        -2       2
B3       0.3     -0.4   -0.557    -1       -1        0        0
B4       -0.2    -0.8   -0.578    -1       -1        0        0
ORIGIN   0       0
WT       -0.67   0.89
TOTAL                                                0        10
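The computation behind the table above can be sketched in code. This is a reading of the spreadsheet, not its actual formulas: it assumes NET is the dot product of the weight vector with the pattern, OUT is +1 when NET >= 0 and -1 otherwise, and the response error is desired minus actual. The pattern coordinates and desired outputs are taken from the table.

```python
# Sketch of the worksheet's SSE computation (assumed conventions:
# NET = w . x, OUT = +1 if NET >= 0 else -1, error = desired - actual).
# Pattern data comes from the table above.

patterns = [
    # name,  x,    y,   desired output
    ("A1",  0.3,  0.7,  1),
    ("A2",  0.4,  0.9,  1),
    ("A3",  0.5,  0.5,  1),
    ("A4",  0.7,  0.3,  1),
    ("B1", -0.6,  0.3, -1),
    ("B2", -0.4, -0.2, -1),
    ("B3",  0.3, -0.4, -1),
    ("B4", -0.2, -0.8, -1),
]

def sse(wx, wy):
    """Sum of squared response errors for the weight vector (wx, wy)."""
    total = 0.0
    for name, x, y, desired in patterns:
        net = wx * x + wy * y            # weighted sum of inputs
        actual = 1 if net >= 0 else -1   # threshold output unit
        error = desired - actual         # response error for this pattern
        total += error ** 2
    return total

print(sse(-0.67, 0.89))  # SSE for the weight vector shown in the table
```

With the weight vector (1.0, 1.0), for example, sse returns 0.0: every pattern falls on the correct side of the decision boundary, which is what the learning rule is trying to achieve.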
