Asrar Ahmad
405 IMBA 9th
Table of Contents
Introduction
1.1 The Brain
1.2 The Artificial Neural Network
1.3 Benefits of ANN
1.4 Working of a Neuron
2. Learning process
2.1 Learning
2.2 Paradigms of learning
2.2.1 Error-correction learning
2.2.2 Hebbian learning
2.3
4. APPLICATIONS
4.1 IN GEOTECHNICAL ENGINEERING
4.2 IN MEDICINE
4.3 IN BUSINESS
Introduction
1.1 The Brain:
The brain is a highly complex, nonlinear, and parallel information-processing
system. It has the capability of organizing neurons so as to perform certain
computations (e.g. pattern recognition, perception, and motor control) many
times faster than the fastest digital computer. As another example, consider the
sonar of a bat. Sonar is an active echo-location system. In addition to providing
information about how far away a target (e.g. a flying insect) is, bat sonar
conveys information about the relative velocity of the target, the size of the
target, the size of various features of the target, and the azimuth and elevation of
the target. The complex neural computations needed to extract all this
information from the target echo occur within a brain the size of a plum.
How, then, does the brain of a bat do it? At birth, a brain has great structure and
the ability to build up its own rules through what we usually refer to as
experience.
Indeed, experience is built up over the years, with the most dramatic
development (i.e. the hard-wiring) of the human brain taking place in the first
two years from birth; but the development continues well beyond that stage. During
this early stage of development, about one million synapses are formed per
second.
Synapses are elementary structural and functional units that mediate the
interactions between neurons. In traditional descriptions of neural organization,
it is assumed that a synapse is a simple connection that can impose excitation or
inhibition on the receptive neuron.
(Hajek 2005)
The output of neuron k is yk = φ(vk), where φ is the activation function and vk
is the induced local field of neuron k (the weighted sum of its inputs).
2. Learning process
2.1 Learning
Among the many interesting properties of a neural network is the ability of the
network to learn from its environment and to improve its performance through
learning. A neural network learns about its environment through an iterative
process of adjustments applied to its synaptic weights and thresholds.
We define learning in the context of neural networks as follows:
Learning is a process by which the free parameters of a neural network are
adapted through a continuing process of stimulation by the environment in
which the network is embedded. The type of learning is determined by the
manner in which the parameter changes take place.
(Hajek 2005)
This definition of the learning process implies the following sequence of events:
The neural network is stimulated by an environment.
The neural network undergoes changes as a result of this stimulation.
The neural network responds in a new way to the environment, because
of the changes that have occurred in its internal structure.
Let wkj(n) denote the value of the synaptic weight wkj at time n. At time n an
adjustment Δwkj(n) is applied to the synaptic weight wkj(n), yielding the
updated value:
wkj(n+1) = wkj(n) + Δwkj(n)
A prescribed set of well-defined rules for the solution of a learning problem is
called a learning algorithm. As one would expect, there is no unique learning
algorithm for the design of neural networks. Rather, we have a kit of tools
represented by a diverse variety of learning algorithms, each of which offers
advantages of its own. Basically, learning algorithms differ from each other in
the way in which the adjustment Δwkj to the synaptic weight wkj is formulated.
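In the error-correction paradigm, for instance, the adjustment is often the delta rule, Δwkj(n) = η ek(n) xj(n), where η is a learning rate and ek(n) is the error between the desired and actual response. A minimal sketch of this update loop, assuming a single linear neuron and a made-up training pair:

```python
import numpy as np

# Delta-rule sketch for one linear neuron: Δw_kj(n) = eta * e_k(n) * x_j(n).
# The input x, target d, and learning rate are illustrative.
rng = np.random.default_rng(0)
w = rng.normal(size=3)            # initial synaptic weights w_kj(0)
eta = 0.1                         # learning rate

x = np.array([1.0, 0.5, -0.3])    # input signals x_j(n)
d = 1.0                           # desired response d_k(n)

for n in range(100):
    y = w @ x                     # linear neuron output y_k(n)
    e = d - y                     # error signal e_k(n) = d_k(n) - y_k(n)
    w = w + eta * e * x           # w_kj(n+1) = w_kj(n) + eta * e_k(n) * x_j(n)

# After repeated adjustments the error has shrunk essentially to zero.
print(abs(d - w @ x) < 1e-3)
```

Each pass applies exactly the update wkj(n+1) = wkj(n) + Δwkj(n) defined above; the error decays geometrically for a suitable learning rate.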
(Hajek 2005)
Unsupervised learning or self-organization, in which an (output) unit is
trained to respond to clusters of patterns within the input. In this paradigm
the system is supposed to discover statistically salient features of the
input population. Unlike the supervised learning paradigm, there is no a
priori set of categories into which the patterns are to be classified; rather
the system must develop its own representation of the input stimuli.
Once the network has become tuned to the statistical regularities of the
input data, it develops the ability to form internal representations for
encoding features of the input and thereby creates new classes
automatically.
(Hajek 2005)
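A minimal sketch of this idea, using a simple competitive (winner-take-all) rule on invented two-cluster data; the rule and all parameters are illustrative, not the specific scheme described above:

```python
import numpy as np

# Unsupervised self-organization sketch: the output unit nearest the
# input "wins" and moves toward it, so each unit becomes tuned to one
# statistical cluster of the input, with no labels given in advance.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (50, 2)),   # cluster near (0, 0)
                  rng.normal(1.0, 0.1, (50, 2))])  # cluster near (1, 1)
w = rng.uniform(0, 1, (2, 2))                      # two output units
eta = 0.05                                         # learning rate

for _ in range(20):                                # repeated presentations
    for x in data:
        k = np.argmin(np.linalg.norm(w - x, axis=1))  # winning unit
        w[k] += eta * (x - w[k])                      # move winner toward x

# Each unit's weight vector has drifted to one cluster: the network has
# formed its own internal representation of the two input classes.
```

Once tuned, the index of the winning unit for a new input acts as an automatically created class label.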
[Figure: signal-flow graph of neuron k, showing input xj(n), synaptic weight wkj(n), and output yk(n)]
(Hajek 2005)
(Zurada 1992)
The output unit of a perceptron is a linear threshold element. Rosenblatt (1959)
proved a remarkable theorem about perceptron learning, and in the early 1960s
perceptrons created a great deal of interest and optimism.
The initial euphoria was replaced by disillusion after the publication of Minsky
and Papert's book Perceptrons (Papert and Minsky 1969). In this book they analyzed the
perceptron thoroughly and proved that there are severe restrictions on what
perceptrons can represent. One of Minsky and Papert's most discouraging
results shows that a single-layer perceptron cannot represent a simple
exclusive-OR function.
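The limitation is easy to demonstrate. A sketch of the perceptron learning rule applied to the linearly separable AND function and to XOR; the data encoding and parameters are illustrative:

```python
import numpy as np

# Rosenblatt's perceptron rule on two Boolean problems. AND is linearly
# separable, so the rule converges to a perfect classifier; XOR is not,
# so no single linear threshold unit can represent it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
AND = np.array([0, 0, 0, 1])
XOR = np.array([0, 1, 1, 0])

def train_perceptron(X, d, epochs=50, eta=0.1):
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for x, t in zip(X, d):
            y = 1 if w @ x + b > 0 else 0   # linear threshold output
            w += eta * (t - y) * x           # perceptron weight update
            b += eta * (t - y)               # threshold (bias) update
    y = (X @ w + b > 0).astype(int)
    return (y == d).mean()                   # fraction classified correctly

print(train_perceptron(X, AND))  # 1.0: AND is learned exactly
print(train_perceptron(X, XOR))  # below 1.0: XOR cannot be represented
```

The failure on XOR is not a defect of the learning rule but of the single-layer architecture; adding a hidden layer removes the restriction.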
3.2 Back-propagation networks
This feed-forward back-propagation neural network is fully connected, which
means that a neuron in any layer is connected to all neurons in the previous
layer. Signal flow through the network progresses in a forward direction, from
left to right, on a layer-by-layer basis.
(Hajek 2005)
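A sketch of such a fully connected forward pass, with illustrative layer sizes, random weights, and a sigmoid activation (the back-propagation of errors is omitted here):

```python
import numpy as np

# Forward pass through a fully connected feed-forward network: every
# neuron receives input from all neurons in the previous layer, and the
# signal moves strictly left to right, layer by layer.
rng = np.random.default_rng(2)
sizes = [3, 4, 2]                              # input, hidden, output widths
weights = [rng.normal(size=(m, n))             # W[k, j] connects neuron j in
           for n, m in zip(sizes, sizes[1:])]  # one layer to neuron k in the next

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def forward(x):
    y = x
    for W in weights:
        y = sigmoid(W @ y)   # v_k = sum_j w_kj * y_j, then y_k = phi(v_k)
    return y

out = forward(np.array([0.5, -1.0, 0.2]))
print(out.shape)             # one activation per output-layer neuron
```

Because every weight matrix is dense, each layer's output depends on all neurons of the layer before it, exactly the "fully connected" property described above.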
(Zurada 1992)
(Kohonen 1982-90)
4. APPLICATIONS
Neural networks have broad applicability to real world business problems. In
fact, they have already been successfully applied in many industries.
4.1 IN GEOTECHNICAL ENGINEERING
In the field of geotechnical engineering, we encounter some types of problems
that are very complex. ANNs provide several advantages over more
conventional computing techniques. For most traditional mathematical models,
the lack of physical understanding is usually supplemented by either simplifying
the problem or incorporating several assumptions into the models. Mathematical
models also rely on assuming the structure of the model in advance, which may
be suboptimal. That is why many mathematical models fail to simulate the
complex behavior of most geotechnical engineering problems.
In contrast, ANNs are a data-driven approach in which the model can be trained
on input-output data pairs to determine the structure and parameters of the
model. In this case, there is no need to either simplify the problem or
incorporate any assumptions. Moreover, ANNs can always be updated to obtain
better results by presenting new training examples as new data become
available. These factors combine to make ANNs a powerful modeling tool in
geotechnical engineering.
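As an illustrative sketch of this data-driven idea, a model can be fitted directly to input-output pairs by gradient descent and then refined as new examples arrive; the linear relation y = 2x + 1 is invented and merely stands in for measured geotechnical behaviour:

```python
import numpy as np

# Data-driven modelling sketch: fit parameters to observed input-output
# pairs, then update the same model when new data become available.
rng = np.random.default_rng(3)

def train(X, y, w, b, eta=0.1, steps=2000):
    for _ in range(steps):
        e = y - (w * X + b)            # residuals on the current data set
        w += eta * (e * X).mean()      # gradient step for the weight
        b += eta * e.mean()            # gradient step for the bias
    return w, b

X = rng.uniform(0, 1, 20)              # measured inputs (illustrative)
y = 2 * X + 1                          # measured outputs (illustrative)
w, b = train(X, y, 0.0, 0.0)           # initial fit from data alone

# New examples arrive: continue training the existing model, with no
# need to simplify the problem or assume a model structure in advance.
X2 = rng.uniform(0, 1, 10)
w, b = train(np.concatenate([X, X2]), np.concatenate([y, 2 * X2 + 1]), w, b)
```

The fitted parameters approach the underlying values (2 and 1) without the relation ever being stated to the model, which is the point of the data-driven approach.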
4.2 IN MEDICINE
Neural networks are well suited to recognizing diseases from scans, since there
is no need to provide a specific algorithm for how to identify the disease.
Neural networks learn by example, so the details of how to recognize the disease
are not needed. What is needed is a set of examples that are representative of
all the variations of the disease.
Neural networks are used experimentally to model the human cardiovascular
system. Diagnosis can be achieved by building a model of the cardiovascular
system of an individual and comparing it with real-time physiological
measurements taken from the patient. If this routine is carried out regularly,
potentially harmful medical conditions can be detected at an early stage, which
makes the process of combating the disease much easier.
An application developed in the mid-1980s called the "instant physician"
trained an auto-associative memory neural network to store a large number of
medical records, each of which includes information on symptoms, diagnosis,
and treatment for a particular case. After training, the net can be presented with
input consisting of a set of symptoms; it will then find the full stored pattern
that represents the "best" diagnosis and treatment.
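A toy sketch in the same spirit, using a Hopfield-style auto-associative memory with invented ±1 "record" vectors and the Hebbian outer-product storage rule; this is an illustration of the principle, not the original system:

```python
import numpy as np

# Auto-associative recall sketch: each stored record is a +/-1 vector of
# symptom/diagnosis/treatment flags. Presenting a partial or noisy set
# of flags retrieves the full stored pattern. Records are made up.
records = np.array([
    [ 1, -1,  1, -1,  1,  1, -1, -1],
    [-1, -1,  1,  1, -1,  1, -1,  1],
])
n = records.shape[1]
W = records.T @ records / n       # Hebbian (outer-product) storage
np.fill_diagonal(W, 0)            # no self-connections

def recall(probe, steps=5):
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)        # state settles toward a stored pattern
        s[s == 0] = 1
    return s

# Present a corrupted version of record 0: two flags flipped.
probe = records[0].copy()
probe[1] *= -1
probe[3] *= -1
print((recall(probe) == records[0]).all())   # full record is retrieved
```

The network completes the pattern rather than looking it up, which is why a partial set of symptoms suffices to retrieve the "best" stored diagnosis and treatment.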
Some of the applications are:
Medical Diagnosis
Detection and Evaluation of Medical Phenomena
Patient's Length of Stay Forecasts
Treatment Cost Estimation
4.3 IN BUSINESS
Business is a diversified field with several general areas of specialization,
such as accounting or financial analysis. Almost any neural network application
would fit into one business area or into financial analysis.
4.3.1 IN FINANCE
4.4 IN SCIENCE
Pattern Recognition
Recipes and Chemical Formulation Optimization
Chemical Compound Identification
Physical System Modeling
Ecosystem Evaluation
Polymer Identification
Recognizing Genes
Botanical Classification
Signal Processing: Neural Filtering
Biological Systems Analysis
Odor Analysis and Identification
4.5 IN EDUCATION
Teaching Neural Networks
Neural Network Research
College Application Screening
Predict Student Performance
4.6 ENERGY
Electrical Load Forecasting
Energy Demand Forecasting
Short and Long-Term Load Estimation
Predicting Gas/Coal Index Prices
Robots that can see, feel, and predict the world around them
Composition of music
Bibliography
Freeman, J.A. Simulating neural networks with Mathematica. 1994.
Freeman, J.A., and D.M. Skapura. Neural networks - Algorithms, applications, and programming.
1991.
ftp://ftp.sas.com/pub/neural/FAQ.html. n.d.
Gorman, and Sejnowski. 1988.
Hajek, M. NEURAL NETWORKS. 2005.
Hertz, J, A Krogh, and R.G. Palme. Introduction to the theory of neural computation. 1991.
Hopfield, and Tank. Biological Cybernetics. 1985.
Jordan, M. I. Attractor dynamics and parallelism in a connectionist sequential machine. 1986.
Josin. 1988.
Kohonen. 1982-90.
McCulloch, W. S., and W. Pitts. A logical calculus of the ideas immanent in nervous activity. 1943.
Papert, and Minsky. Perceptrons. 1969.
Parker. 1985.
Rosenblatt, F. Principles of Neurodynamics. Spartan Books. New York, 1959.
Rumelhart, Hinton, and Williams. 1986.
Sejnowski, and Rosenberg. 1986.
Werbos. 1974.
Werbos, Parker, and Cun. 1974;1985;1985.
Widrow, and Hoff. Adaptive switching circuits. 1960.
Zurada, J.M. Introduction to artificial neural systems. 1992.