NEURAL NETWORKS
ABSTRACT:
Neural networks have become one of the most popular models for distributed
computation, in particular, and distributed processes in general. They are used for a
diversity of purposes and are especially promising for artificial intelligence. Neural
networks are ideally suited to describe the spatial and temporal dependence of tracer-
tracer correlations. The neural network performs well even in regions where the
correlations are less compact and normally a family of correlation curves would be
required.
DEFINITION: A SIMPLE NEURAL NETWORK
An artificial neuron is a device with many inputs and one output. The neuron has two
modes of operation: the training mode and the using mode. In the training mode, the
neuron can be trained to fire (or not) for particular input patterns. In the using mode,
when a taught input pattern is detected at the input, its associated output becomes the
current output. If the input pattern does not belong to the taught list of input patterns, the
firing rule is used to determine whether to fire or not.
A simple neuron
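The training and using modes described above can be sketched in a few lines of Python. The text does not fix a particular firing rule, so the Hamming-distance rule below (fire like the nearest taught pattern) is only one common, illustrative choice:

```python
def hamming(a, b):
    """Number of positions at which two patterns differ."""
    return sum(x != y for x, y in zip(a, b))

class SimpleNeuron:
    def __init__(self):
        self.taught = {}  # input pattern -> output (1 = fire, 0 = do not fire)

    def train(self, pattern, fires):
        """Training mode: associate a pattern with firing or not firing."""
        self.taught[tuple(pattern)] = int(fires)

    def output(self, pattern):
        """Using mode: taught patterns recall their output directly;
        unknown patterns fall back to the firing rule."""
        p = tuple(pattern)
        if p in self.taught:
            return self.taught[p]
        # Firing rule (assumption): copy the output of the nearest
        # taught pattern in Hamming distance.
        nearest = min(self.taught, key=lambda q: hamming(p, q))
        return self.taught[nearest]

n = SimpleNeuron()
n.train([1, 1, 1], fires=True)
n.train([0, 0, 0], fires=False)
print(n.output([1, 1, 0]))  # prints 1: closer to the taught firing pattern
```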
HUMAN AND ARTIFICIAL NEURONS:
In the human brain, a typical neuron collects signals from others through a host of
fine structures called dendrites.
The neuron sends out spikes of electrical activity through a long, thin strand
known as an axon, which splits into thousands of branches.
At the end of each branch, a structure called a synapse converts the activity from
the axon into electrical effects that inhibit or excite activity in the connected
neurons.
When a neuron receives excitatory input that is sufficiently large compared with
its inhibitory input, it sends a spike of electrical activity down its axon.
Learning occurs by changing the effectiveness of the synapses so that the
influence of one neuron on another changes.
Components of a neuron
HISTORICAL DEVELOPMENT:
Neural network simulations appear to be a recent development. However, this field was
established before the advent of computers, and has survived at least one major setback
and several eras. Neural networks are also similar to the biological neural networks in the
sense that functions are performed collectively and in parallel by the units, rather than
there being a clear delineation of subtasks to which various units are assigned.
Component-based representation of a neural network: in a neural network model,
simple nodes (variously called "PEs" ("processing elements") or "units") are
connected together to form a network of nodes, hence the term "neural network".
A neural network involves a network of simple processing elements (neurons)
which can exhibit complex global behavior, determined by the connections between the
processing elements and the element parameters.
TYPES:
1. Feedforward network :
In this network, the information moves in only one direction, forward, from
the input nodes, through the hidden nodes, and to the output nodes.
There are no cycles or loops in the network.
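As a minimal sketch of this one-way flow of information, assuming tanh activations and plain Python lists for the weights (both illustrative choices, not anything prescribed by the text):

```python
import math

def feedforward(x, layers):
    """Propagate input x forward through each (weights, biases) layer.

    Information flows one way only: input -> hidden -> output,
    with no cycles or feedback connections.
    """
    for W, b in layers:
        # Each unit computes a weighted sum of its inputs plus a bias,
        # then applies a tanh activation.
        x = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x

# Example: 2 inputs -> 2 hidden units -> 1 output unit
hidden = ([[0.5, -0.5], [0.3, 0.8]], [0.0, 0.1])
out = ([[1.0, -1.0]], [0.0])
y = feedforward([1.0, 0.0], [hidden, out])
```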
3. Recurrent network :
RNs propagate data from later processing stages back to earlier stages. Recurrent
networks are classified into three different types.
Radial basis function (RBF) network :
An RBF is a function which has a distance criterion with respect to a centre
built in.
RBF networks have two layers of processing:
Input is mapped onto each RBF in the 'hidden' layer.
In regression problems the output layer is then a linear combination of hidden
layer values representing mean predicted output.
In classification problems the output layer is typically a sigmoid function of a
linear combination of hidden layer values, representing a posterior probability.
Performance in both cases is often improved by shrinkage techniques, such as
ridge regression.
The input layer consists of n units which represent the elements of the vector x. The k
components of the sum in the definition of f are represented by the units of the hidden
layer. The single output neuron gets its input from all hidden neurons.
Working Principle:
The principle of radial basis functions derives from the theory of function
approximation. Given N pairs (xi, yi) (xi ∈ Rn, yi ∈ R), we are looking for a function f of
the form:

f(x) = Σi=1..k ci h(||x − ti||)

h is the radial basis function (normally a Gaussian function) and the ti are the k centers which
have to be selected. The coefficients ci are also unknown at the moment and have to be
computed. xi and ti are elements of an n-dimensional vector space.
The RBF network has the Gaussian function as the activation function of each hidden
unit and the output layer performs a linear combination of the localized bumps and is thus
able to approximate any function.
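The "localized bumps" idea can be sketched directly from the formula above: each hidden unit contributes a Gaussian bump centered at ti, and the output is their linear combination. A minimal sketch, with illustrative names:

```python
import math

def rbf_output(x, centers, coeffs, sigma=1.0):
    """f(x) = sum_i c_i * h(||x - t_i||) with a Gaussian h.

    Each hidden unit is a localized bump centered at one t_i; the
    output layer just linearly combines the bumps.
    """
    total = 0.0
    for c, t in zip(coeffs, centers):
        dist2 = sum((xj - tj) ** 2 for xj, tj in zip(x, t))
        total += c * math.exp(-dist2 / (2 * sigma ** 2))
    return total
```

For example, `rbf_output([0.0], [[0.0]], [2.0])` returns 2.0, since the input sits exactly on the bump's center, where the Gaussian equals 1.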
When the RBF network is used in classification, the hidden layer performs clustering
while the output layer performs classification. The hidden units give the strongest
response when the input patterns are close to the centers of the hidden units, and a
gradually weaker response as the input patterns move away from the centers. The output
layer linearly combines all the outputs of the hidden layer. Each output node then
gives an output value which represents the probability that the input pattern falls
under that class.
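For classification, the read-out described above (a sigmoid of a linear combination of hidden-layer activations, interpreted as a posterior probability) might look like this minimal sketch, with illustrative names:

```python
import math

def class_probability(hidden_outputs, weights, bias=0.0):
    """Sigmoid of a linear combination of hidden RBF activations,
    read as the posterior probability of one class."""
    z = sum(w * h for w, h in zip(weights, hidden_outputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```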
Advantages of Radial Basis Function Network :
These networks train extremely fast and require fewer training samples.
Updating the network is faster than training it from scratch, saving significant
time.
There are also savings in space and possibly in computation.
Disadvantages of Radial Basis Function Network :
Tuning the various parameters (radius, centers, etc.) can get quite
complicated.
Choosing the right centers (for the hidden layer) is of critical importance.
Training the supervised layer of the network using gradient descent faces the
problem of getting stuck in a local minimum.
Applications of Radial Basis Function Network :
Radial basis functions are used in the area of neural networks, where they may
serve as a replacement for the sigmoidal hidden-layer transfer characteristic in
multi-layer perceptrons.
They are also used for pattern classification.
Neural networks are well suited for prediction or forecasting needs including:
sales forecasting
industrial process control
customer research
data validation
risk management
target marketing
PATTERN RECOGNITION:
MEDICINE:
Instant physician: The net is presented with an input consisting of a set of
symptoms; it will then find the full stored pattern that represents the "best"
diagnosis and treatment.
BUSINESS:
There is a strong potential for using neural networks for database mining, that is,
searching for patterns implicit in the explicitly stored information in databases.
Credit evaluation: The HNC neural systems were applied to mortgage screening.
CONTRIBUTION:
The computing world has a lot to gain from neural networks. Their ability to learn by
example makes them very flexible and powerful. Furthermore, there is no need to devise
an algorithm in order to perform a specific task; i.e. there is no need to understand the
internal mechanisms of that task. They are also very well suited for real-time systems
because of their fast response and computation times, which are due to their parallel
architecture.
Neural networks also contribute to other areas of research such as neurology and
psychology. They are regularly used to model parts of living organisms and to investigate
the internal mechanisms of the brain.
Perhaps the most exciting aspect of neural networks is the possibility that some day
'conscious' networks might be produced. There are a number of scientists arguing that
consciousness is a 'mechanical' property and that 'conscious' neural networks are a
realistic possibility.
CONCLUSION:
Finally, we would like to state that even though neural networks have a huge potential,
we will only get the best out of them when they are integrated with computing, AI, fuzzy
logic and related subjects.