
Introduction to Artificial Intelligence

(AI-II)

Halgurd S. Maghdid
Chihan University – 2017-2018
Lecture – 6
Outline

Single and Multilayer NN

Different Activation Functions

Perceptron Neural Network

Single Layer
Multi Layer
Single layer NN

[Figure: a single-layer network — an input layer of source nodes connected directly to an output layer of neurons]
Multi layer NN

● A feed-forward neural network (FFNN) is a more general network architecture, with one or more hidden layers between the input and output layers.
● Hidden nodes do not directly receive inputs from, nor send outputs to, the external environment.
● FFNNs overcome the limitations of single-layer NNs.
● They can handle non-linearly separable learning tasks.
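As a concrete illustration of why hidden layers help, a hand-wired two-layer network of unit-step nodes can compute XOR, which no single-layer network can. This is a minimal sketch: the weights are chosen by hand (hidden node 1 acts as OR, hidden node 2 as NAND) and the function names are illustrative.

```python
def step(net):
    """Unit-step activation: 1 if the net input exceeds 0, else 0."""
    return 1 if net > 0 else 0

def xor_net(a, b):
    # Hidden layer: h1 computes OR(a, b), h2 computes NAND(a, b)
    h1 = step(1.0 * a + 1.0 * b - 0.5)
    h2 = step(-1.0 * a - 1.0 * b + 1.5)
    # Output node computes AND(h1, h2), which equals XOR(a, b)
    return step(1.0 * h1 + 1.0 * h2 - 1.5)
```

Calling `xor_net` on all four binary input pairs reproduces the XOR truth table, a task the single-layer network of the previous slide cannot learn.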

[Figure: a 3-4-2 network — input layer, hidden layer, and output layer]
Activation Functions

The activation (transfer) function translates the input signals to output signals. Four types of transfer functions are commonly used: unit step (threshold), sigmoid, piecewise linear, and Gaussian.

Unit step (threshold)
The output is set at one of two levels, depending on whether the total input is greater than or less than some threshold value.
Activation Functions

Sigmoid
The sigmoid family consists of two functions: the logistic function and the hyperbolic tangent. The values of the logistic function range from 0 to 1, while the hyperbolic tangent ranges from -1 to +1.
Activation Functions

Piecewise Linear
The output is proportional to the total weighted input.
Activation Functions

Gaussian
Gaussian functions are bell-shaped curves that are continuous. The node output (high/low) is interpreted in terms of class membership (1/0), depending on how close the net input is to a chosen value of the average (mean).
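The four transfer functions above can be sketched in Python. This is a minimal sketch: the function names, default thresholds, and clipping bounds are illustrative choices, not part of the slides.

```python
import math

def unit_step(net, threshold=0.0):
    """Output is one of two levels, depending on whether net exceeds the threshold."""
    return 1 if net > threshold else 0

def logistic(net):
    """Logistic sigmoid: output ranges from 0 to 1."""
    return 1.0 / (1.0 + math.exp(-net))

def tanh_sigmoid(net):
    """Hyperbolic-tangent sigmoid: output ranges from -1 to +1."""
    return math.tanh(net)

def piecewise_linear(net, lo=-1.0, hi=1.0):
    """Proportional to the total weighted input, clipped to [lo, hi]."""
    return max(lo, min(hi, net))

def gaussian(net, mean=0.0, sigma=1.0):
    """Bell-shaped curve: output peaks at 1 when net equals the chosen mean."""
    return math.exp(-((net - mean) ** 2) / (2.0 * sigma ** 2))
```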
OR Gate Example

[Figure: a two-input perceptron with inputs A and B, weights w1 and w2, and a single output]

Initial weight values: 0.3, 0.2, -0.1, 0.1
Activation function: unit step
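A minimal sketch of the OR gate as a unit-step perceptron. The slide lists several candidate weight values without assigning them; here w1 = w2 = 0.3 and a threshold of 0.2 are assumed purely for illustration — with these parameters the gate reproduces the OR truth table without any training.

```python
def step(net):
    # Unit-step activation
    return 1 if net > 0 else 0

def or_gate(a, b, w1=0.3, w2=0.3, theta=0.2):
    # Net input: weighted sum of inputs A and B minus the threshold
    # (w1, w2, theta are illustrative values, not the slide's exact ones)
    return step(w1 * a + w2 * b - theta)
```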
Perceptron Neural Network

History of the Perceptron Model

In 1957, Rosenblatt and several other researchers developed the perceptron, which used a network similar to the one proposed by McCulloch, together with a learning rule for training the network to solve pattern recognition problems.

(*) This model was later criticized by Minsky, who proved that a single-layer perceptron cannot solve the XOR problem.
Structure

The network structure includes:

Input layer: input variables with binary-type information. The number of nodes depends on the problem dimension.
Processing node: uses a linear aggregation function, i.e., net_j = Σ_i I_i, and the bias θ_j is used.
Output layer: the computed result is generated through the transfer function.
Transfer function: discrete type, i.e., the step function.
Perceptron Network Structure

[Figure: inputs X1 and X2 feed nodes f1 and f2 through weights W11, W12, W21, W22; f1 and f2 feed the output node f3 (output Y1) through weights W13 and W23]
The training process
The training steps (one layer at a time):
1. Choose the network layer, nodes, and connections.
2. Randomly assign weights W_ij and bias θ_j.
3. Input training sets X_i (preparing targets T_j for verification).
4. Training computation:

   net_j = Σ_i W_ij X_i − θ_j

   Y_j = 1 if net_j > 0, otherwise Y_j = 0
The training process
5. Weight adjustment:
If T_j − Y_j ≠ 0 then:

   ΔW_ij = η (T_j − Y_j) X_i
   Δθ_j = −η (T_j − Y_j)

   where η is the learning rate and T_j is the target output.

Update weights and bias:

   W_ij(new) = W_ij + ΔW_ij
   θ_j(new) = θ_j + Δθ_j

6. Repeat steps 3 to 5 until every input pattern is satisfied, i.e., T_j − Y_j = 0.
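The training steps above can be sketched as a loop. This is a minimal sketch under stated assumptions: the function name, learning rate, epoch limit, and the range of the random initialization are illustrative, and the bias enters as net_j = Σ W_ij X_i − θ_j as in step 4.

```python
import random

def step(net):
    # Step 4's discrete transfer function: Y = 1 if net > 0, else 0
    return 1 if net > 0 else 0

def train_perceptron(patterns, targets, eta=0.5, max_epochs=100, seed=0):
    """Perceptron training: adjust weights and bias until every pattern is satisfied."""
    rng = random.Random(seed)
    n = len(patterns[0])
    w = [rng.uniform(-0.5, 0.5) for _ in range(n)]  # step 2: random weights W_ij
    theta = rng.uniform(-0.5, 0.5)                  # step 2: random bias theta_j
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(patterns, targets):         # step 3: feed training sets X_i
            net = sum(wi * xi for wi, xi in zip(w, x)) - theta  # step 4
            y = step(net)
            if t - y != 0:                          # step 5: adjust on error
                errors += 1
                w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
                theta = theta - eta * (t - y)
        if errors == 0:                             # step 6: every pattern satisfied
            break
    return w, theta
```

For a linearly separable task such as OR, the loop terminates with weights that classify every training pattern correctly.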
The recall process

• After the network has been trained as described above, any input vector X can be fed into the perceptron network to derive the computed output. The ratio of correctly classified outputs to the total number of outputs is treated as the prediction performance of the network.

• The trained weights W_ij and the bias θ_j are used to derive net_j, and therefore the output Y_j can be obtained for pattern recognition (or for prediction).
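The recall step can be sketched as follows (a minimal sketch; the function names are illustrative, and the bias again enters as net_j = Σ W_ij X_i − θ_j):

```python
def recall(w, theta, x):
    """Recall: derive net_j from trained weights/bias and apply the step function."""
    net = sum(wi * xi for wi, xi in zip(w, x)) - theta
    return 1 if net > 0 else 0

def accuracy(w, theta, patterns, targets):
    """Ratio of correctly classified patterns -- the prediction performance."""
    correct = sum(recall(w, theta, x) == t for x, t in zip(patterns, targets))
    return correct / len(patterns)
```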
Example: Solving the AND problem
• This is a problem of recognizing the AND pattern.
• The training patterns used are as follows:

  X1  X2  T
   0   0  0
   0   1  0
   1   0  0
   1   1  1

[Figure: the four patterns plotted in the (X1, X2) plane, with the separating line f1]
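Applying the training procedure from the earlier slides to these AND patterns can be sketched as a self-contained example. The function names, learning rate, epoch limit, and random initialization range are illustrative assumptions; the AND truth table is taken from the table above.

```python
import random

def step(net):
    # Discrete transfer function: 1 if net > 0, else 0
    return 1 if net > 0 else 0

def train(patterns, targets, eta=0.5, max_epochs=100, seed=1):
    """Train a single perceptron node until every pattern is satisfied."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.5, 0.5) for _ in range(len(patterns[0]))]
    theta = rng.uniform(-0.5, 0.5)
    for _ in range(max_epochs):
        done = True
        for x, t in zip(patterns, targets):
            y = step(sum(wi * xi for wi, xi in zip(w, x)) - theta)
            if t != y:
                done = False
                w = [wi + eta * (t - y) * xi for wi, xi in zip(w, x)]
                theta -= eta * (t - y)
        if done:
            break
    return w, theta

# AND training patterns from the table above
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]
w, theta = train(X, T)
```

Because AND is linearly separable, the loop converges and the trained weights reproduce the target column T for all four patterns.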
