
BACK PROPAGATION

Presented By: Richa Sharma, M.Tech CSE

CONTENTS

Introduction
Back propagation network
Steps of back propagation learning
Algorithm
Error measure
Advantages
Limitations
Applications

WHY BACK PROPAGATION


Errors can be computed directly only at the output layer, not at the intermediate layers.
The central idea behind back propagation is that the errors for the hidden-layer units are determined by propagating the error backward from the output layer to the hidden layers.
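As a concrete form of this idea (standard back propagation notation; the slide itself gives no formula), the error term of a hidden unit j is built from the error terms of the output units it feeds into, weighted by the connecting weights and scaled by the derivative of the activation function:

    \[ \delta_j = \varphi'(\mathrm{net}_j) \sum_{k} \delta_k \, w_{jk} \]

Here \delta_k is the error of output unit k, w_{jk} the weight from hidden unit j to output unit k, and \varphi the activation function.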

BACK PROPAGATION OF ERRORS

[Figure: function signals propagate forward through the network, while error signals propagate backward from the output layer.]

BACK PROPAGATION
The idea of back propagation was presented by Rumelhart, Hinton and Williams in 1986.
Back propagation is a common method of training artificial neural networks so as to minimize the objective function.
It is a supervised learning method, and a generalization of the delta rule.
It is a systematic method of training multilayer artificial neural networks.

BACK PROPAGATION NETWORK


A back propagation network consists of three layers of units:

An input layer
At least one intermediate (hidden) layer
An output layer

Units are connected in a feed-forward fashion: input units are fully connected to the hidden-layer units, which are in turn fully connected to the output units.
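As an illustrative sketch (not from the slides), this fully connected feed-forward structure can be expressed in a few lines of NumPy; the layer sizes, random initialization, and sigmoid activation are assumptions chosen for the example:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    n_in, n_hidden, n_out = 3, 4, 2                     # assumed layer sizes
    W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))  # hidden -> output weights

    def forward(x):
        # Propagate an input pattern forward: input -> hidden -> output.
        h = sigmoid(W1 @ x)
        o = sigmoid(W2 @ h)
        return h, o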

DIAGRAM

[Figure: a feed-forward back propagation network with an input layer, a hidden layer, and an output layer.]

STEPS IN BACK PROPAGATION LEARNING

An input pattern from the training set is applied to the input units and then propagated forward.
The pattern of activation arriving at the output layer is compared to the correct output pattern to calculate an error signal (formalized below).
The error signal for each target output is then back propagated from the outputs toward the inputs in order to adjust the weights in each layer.
After learning the correct classification for a set of inputs, the network can be tested on other sets to see how well it classifies untrained patterns.
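In the same notation as the hidden-layer formula above (an assumption, since the slides give no explicit formula), the error signal calculated at each output unit k in the second step is

    \[ \delta_k = (T_k - O_k) \, \varphi'(\mathrm{net}_k) \]

where T_k is the target output and O_k the actual output of unit k.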

ALGORITHM
Until convergence (low error or another stopping criterion) do:

Present a training pattern.
Calculate the error of the output nodes (based on T − O).
Calculate the error of the hidden nodes (based on the error of the output nodes, which is propagated back to the hidden nodes).
Continue propagating the error back until the input layer is reached.
Update all weights based on the standard delta rule with the appropriate error function E.
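A minimal end-to-end sketch of this loop (an illustration, not the presenter's code): a 3-4-2 sigmoid network trained on a single made-up pattern, with an assumed learning rate and stopping threshold:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(4, 3))        # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(2, 4))        # hidden -> output weights
    eta = 0.5                                      # learning rate (assumed)

    x = np.array([0.0, 1.0, 0.0])                  # training pattern (made up)
    t = np.array([1.0, 0.0])                       # its target output (made up)

    for epoch in range(10000):                     # until convergence
        h = sigmoid(W1 @ x)                        # present the pattern (forward pass)
        o = sigmoid(W2 @ h)
        delta_o = (t - o) * o * (1.0 - o)          # output-node error, (T - O) * phi'
        delta_h = (W2.T @ delta_o) * h * (1.0 - h) # error propagated back to hidden nodes
        W2 += eta * np.outer(delta_o, h)           # delta-rule weight updates
        W1 += eta * np.outer(delta_h, x)
        if 0.5 * np.sum((t - o) ** 2) < 1e-4:      # stop at low error E
            break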

ERROR MEASURE
The error measure is the difference between the actual output (O) of the network and the target output (T). The error function is defined as:

    \[ E = \frac{1}{2} \sum_{k} (T_k - O_k)^2 \]

where k ranges over the output units.
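For instance (with made-up output and target vectors), this error can be computed directly in the NumPy sketch used above:

    import numpy as np

    T = np.array([1.0, 0.0])          # target output (assumed values)
    O = np.array([0.8, 0.1])          # actual network output (assumed values)
    E = 0.5 * np.sum((T - O) ** 2)    # E = 1/2 * sum_k (T_k - O_k)^2
    print(E)                          # prints approximately 0.025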

ADVANTAGES

Relatively simple to implement
A standard method that generally works well

LIMITATIONS

Slow and inefficient to train
Can get stuck in local minima, resulting in sub-optimal solutions
Convergence of back propagation learning is not guaranteed

APPLICATIONS

Used in complex function approximation, feature extraction and classification, and optimization and control problems
Applicable across many areas of science and technology

THANK YOU
