Applications:
Pattern recognition
Data classification
NN, working in union with the human brain's style of processing, can handle tasks that are impossible for conventional computers
NN vs. Conventional
Computers
Conventional Computers
Neural Networks
They process information in a way similar to the human brain
NN learn by example
Unlike conventional computers, they cannot be programmed to perform a specific task
The examples used to train the NN have to be selected carefully
Disadvantage
The NN finds the solution to a problem by itself, hence its results can be unpredictable
NN and conventional computers are not in competition; they complement each other
A combination of the two approaches can perform at maximum efficiency
(Conventional computers supervise the NN)
Human Neurons to
Artificial Neurons
[Figure: biological neuron, showing dendrites, axon and synapse]
Simple Neuron
The neuron has many inputs and one output
Two modes of operation:
Training mode: the neuron is trained to fire (or not) for particular input patterns
User mode: for a taught input pattern it gives the associated output; for any other input it determines the output from the firing rule
Firing Rules
The firing rule determines whether the neuron should fire or not for any input pattern
It covers all input patterns, including those not used in training
This firing rule is based on Hamming distance
Example
3-input neuron (X1,X2,X3)
Output is 1 for 111 and 101
Output is 0 for 000 and 001
Truth table before applying the firing rule
Firing Rule
Input pattern 010:
differs from 000 by 1 element
from 001 by 2 elements
from 111 by 2 elements
from 101 by 3 elements
The nearest taught pattern is 000, which belongs to class 0
Hence the output for the pattern 010 will be 0
Input pattern 011 is equally distant from taught patterns in both classes (001 in class 0 and 111 in class 1)
Hence the output is undefined
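The example above can be sketched directly in code. This is a minimal illustration of the Hamming-distance firing rule with the four taught patterns from the example; the function names are mine, not from the slides.

```python
# Firing rule based on Hamming distance for the 3-input neuron example:
# taught patterns 111, 101 -> fire (1); 000, 001 -> do not fire (0).
TAUGHT = {"111": 1, "101": 1, "000": 0, "001": 0}

def hamming(a, b):
    """Number of positions at which two bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def fire(pattern):
    """Return 1, 0, or None (undefined) for a 3-bit input pattern."""
    # Distance from the input to the nearest taught pattern in each class.
    d0 = min(hamming(pattern, p) for p, out in TAUGHT.items() if out == 0)
    d1 = min(hamming(pattern, p) for p, out in TAUGHT.items() if out == 1)
    if d0 < d1:
        return 0
    if d1 < d0:
        return 1
    return None  # equally distant from both classes: output undefined

print(fire("010"))  # nearest taught pattern is 000, so class 0
print(fire("011"))  # equally distant from 001 and 111, so undefined
```

Running this reproduces the worked example: 010 yields 0, while 011 is undefined.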
Pattern Recognition
Pattern recognition is one of the most important applications of NN
Example
The network is trained to recognize the patterns:
T: output all black
H: output all white
[Figure: middle and bottom neurons of the recognition network, with their truth tables]
Examples of pattern recognition using the truth table: inputs close to T, close to H, and equally close to both H and T
Complicated Neuron
McCulloch and Pitts model (MCP).
Each input is weighted according to the effect it has on decision making
A weight is a number that is multiplied by the input to give the weighted input
The weighted inputs are summed; if the sum exceeds a threshold value the neuron fires, otherwise it does not
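The MCP rule above (weighted sum compared against a threshold) can be sketched as follows; the particular weights and threshold are illustrative assumptions, not values given in the slides.

```python
# MCP (McCulloch-Pitts) neuron: each input is multiplied by its weight,
# the weighted inputs are summed, and the neuron fires only if the sum
# exceeds the threshold.
def mcp_fire(inputs, weights, threshold):
    """Return 1 if the sum of weighted inputs exceeds the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum > threshold else 0

# Example: two inputs with weight 1.0 and threshold 1.5 act like an AND gate.
print(mcp_fire([1, 1], [1.0, 1.0], 1.5))  # -> 1 (sum 2.0 exceeds 1.5)
print(mcp_fire([1, 0], [1.0, 1.0], 1.5))  # -> 0 (sum 1.0 does not)
```

Choosing other weights or thresholds gives other logic functions, which is why the weighting step matters for decision making.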
Architectures of Neural
Networks
Feed Forward Networks
Feed Backward Networks
Weight Matrix
Every neural network possesses knowledge, which is contained in the values of its connection weights
Two major categories of neural networks, based on the weight matrix:
Fixed networks, in which the weights cannot be changed, i.e. dW/dt = 0
Adaptive networks, which are able to change their weights, i.e. dW/dt ≠ 0
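An adaptive network in this sense is one whose weights move during training (dW/dt ≠ 0). The slides do not name a learning rule, so this sketch uses the classic perceptron update as one common example; all names and values here are illustrative assumptions.

```python
# Adaptive network sketch: the weights change in response to training
# examples (dW/dt != 0). A fixed network would skip the update entirely.
def train_step(weights, threshold, inputs, target, lr=0.1):
    """One perceptron-style update: nudge weights toward the target output."""
    output = 1 if sum(x * w for x, w in zip(inputs, weights)) > threshold else 0
    error = target - output  # 0 if the neuron was right, +/-1 otherwise
    return [w + lr * error * x for w, x in zip(weights, inputs)]

w = [0.0, 0.0]
for _ in range(20):  # repeatedly present one example: inputs [1, 1] -> 1
    w = train_step(w, 0.5, [1, 1], 1)
print(w)  # weights have grown until the neuron fires for this input
```

After a few updates the weighted sum exceeds the threshold, the error becomes zero, and the weights stop changing: the network has stored its "knowledge" in the weight values.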
Transfer Function
The behaviour of an ANN (Artificial Neural Network) depends on:
the weights
the input-output function (transfer function)
Three categories of transfer function:
Linear (or ramp)
Threshold
Sigmoid
Linear units: the output activity is proportional to the total weighted input
Threshold units: the output depends on whether the total input is greater than or less than some threshold value
Sigmoid units: the output varies continuously but not linearly as the input changes
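The three categories above can be sketched as simple functions of the total weighted input x; the slope and threshold parameters are illustrative assumptions.

```python
import math

def linear(x, slope=1.0):
    """Linear (ramp) unit: output proportional to the total weighted input."""
    return slope * x

def threshold(x, theta=0.0):
    """Threshold unit: output is 1 or 0 depending on whether x exceeds theta."""
    return 1 if x > theta else 0

def sigmoid(x):
    """Sigmoid unit: output varies continuously but not linearly with x."""
    return 1.0 / (1.0 + math.exp(-x))

for x in (-2.0, 0.5, 2.0):
    print(x, linear(x), threshold(x), round(sigmoid(x), 3))
```

The sigmoid is bounded between 0 and 1 and is smooth everywhere, unlike the hard jump of the threshold unit.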
Applications of Neural
Networks
Neural networks in medicine:
Modelling and diagnosing the cardiovascular system
Electronic noses
Instant Physician