
Code.No: 36026
R05 SET-1
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
III B.TECH II SEM SUPPLEMENTARY EXAMINATIONS FEBRUARY - 2010
NEURAL NETWORKS
(COMPUTER SCIENCE & ENGINEERING)
Time: 3 Hours                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1. Write the various benefits of neural networks. Explain them in detail. [16]

2.a) Explain in detail about Boltzmann learning.


b) Explain in detail about competitive learning. [8+8]

3.a) What is a perceptron? Explain.


b) Write about the signal-flow graph of the perceptron. [4+12]

4. Explain in detail about the following methods, which are useful in improving the back-
propagation algorithm.
a) Target values
b) Normalizing the inputs. [8+8]

5.a) What is overfitting? Explain the effects of overfitting on generalization.
b) Explain the variants of the cross-validation tool. [8+8]

6.a) Describe the adaptive process of the Self-Organizing Map.


b) What are the salient features of Kohonen's self-organizing learning algorithm? [8+8]

7. Explain neurodynamical models and compare them. [16]

8.a) What is the Hopfield network? Explain.


b) Describe how a Hopfield network can be used for analog-to-digital conversion. [4+12]

--ooOoo--
Code.No: 36026
R05 SET-2
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
III B.TECH II SEM SUPPLEMENTARY EXAMINATIONS FEBRUARY - 2010
NEURAL NETWORKS
(COMPUTER SCIENCE & ENGINEERING)
Time: 3 Hours                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1. Explain the following:


a) Linear divergence
b) Exponential divergence. [8+8]

2. Write about the following:


a) Signal vector
b) Delta rule
c) Learning rate parameter. [5+5+6]
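As a pointer for part (b), the delta rule's weight update can be sketched in a few lines. The scalar target function (d = 2x), the sample data, and the learning rate below are illustrative choices, not part of the question.

```python
# Minimal sketch of the delta (LMS) rule for a single linear neuron.
def lms_fit(samples, eta=0.1, epochs=50):
    """Adapt a scalar weight w so that w*x approximates d for each (x, d)."""
    w = 0.0
    for _ in range(epochs):
        for x, d in samples:
            error = d - w * x          # error signal e(n) = d(n) - y(n)
            w += eta * error * x       # delta rule: delta_w = eta * e(n) * x(n)
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # noiseless samples of d = 2x
w = lms_fit(data)                              # w converges toward 2.0
```

The learning-rate parameter eta controls the step size: too small and convergence is slow, too large (here, eta * x**2 > 2 for some sample) and the update diverges.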

3.a) Explain the Gauss–Newton method for unconstrained optimization.


b) Explain the method of steepest descent for unconstrained optimization. [8+8]

4. Explain in detail about forward pass and backward pass of back-propagation. [16]
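The two passes in question 4 can be sketched compactly; a minimal example assuming a 2-2-1 sigmoid network trained on XOR data (the architecture, seed, and learning rate are all illustrative, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # input patterns
T = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

def forward(X):
    """Forward pass: function signals flow input -> hidden -> output."""
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    return H, Y

def backward(eta=0.5):
    """Backward pass: local gradients (deltas) flow output -> hidden."""
    global W1, b1, W2, b2
    H, Y = forward(X)
    d2 = (Y - T) * Y * (1 - Y)          # output-layer local gradient
    d1 = (d2 @ W2.T) * H * (1 - H)      # hidden-layer local gradient
    W2 -= eta * H.T @ d2; b2 -= eta * d2.sum(axis=0)
    W1 -= eta * X.T @ d1; b1 -= eta * d1.sum(axis=0)

loss_before = np.mean((forward(X)[1] - T) ** 2)
for _ in range(200):
    backward()
loss_after = np.mean((forward(X)[1] - T) ** 2)   # decreases as training proceeds
```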

5. Explain in detail Hessian-based network pruning. [16]

6.a) Write about the Kohonen model of the self-organizing feature map.


b) Write short notes on learning vector quantization. [8+8]

7. Discuss about the stability property of the dynamical system taking an example. [16]

8. What is a gradient-type Hopfield network? Differentiate between the discrete-time
Hopfield network and the gradient-type Hopfield network. [16]

--ooOoo--
Code.No: 36026
R05 SET-3
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
III B.TECH II SEM SUPPLEMENTARY EXAMINATIONS FEBRUARY - 2010
NEURAL NETWORKS
(COMPUTER SCIENCE & ENGINEERING)
Time: 3 Hours                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1. Briefly explain the following:


a) Closed-loop operator
b) Unit-delay operator
c) Linear system. [16]

2. Define the following:


a) Error-correction learning
b) Memory based learning
c) Hebbian learning
d) Boltzmann learning. [16]

3. Write about linearly separable patterns and non-linearly separable patterns in the single-
layer perceptron, with examples. [16]
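The distinction question 3 draws can be made concrete with Rosenblatt's perceptron rule on a linearly separable pattern set; the choice of logical AND, the learning rate, and the epoch count below are illustrative.

```python
# Sketch of the perceptron learning rule on the (linearly separable) AND patterns.
def predict(w, b, x):
    """Hard-limiter output: 1 if the induced local field is positive, else 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train(patterns, epochs=20, eta=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, t in patterns:
            e = t - predict(w, b, x)      # error is -1, 0, or +1
            w[0] += eta * e * x[0]
            w[1] += eta * e * x[1]
            b += eta * e
    return w, b

# AND is linearly separable, so the perceptron convergence theorem applies;
# the XOR patterns, being non-separable, would never converge under this rule.
patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(patterns)
```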

4. Explain in detail about forward pass and backward pass of back-propagation. [16]

5.a) What is overtraining? Explain the effects of overtraining on generalization.
b) Explain the approximate smoother in complexity regularization. [8+8]

6.a) Explain the Kohonen model of the self-organizing feature map and compare it with the
Willshaw–von der Malsburg model.
b) What is the role of the learning vector quantizer in adaptive pattern classification?
Explain in detail. [8+8]

7. Explain the mathematical model for describing the dynamics of a nonlinear system. [16]

8.a) A Hopfield network made up of 5 neurons is required to store the following three
fundamental memories:
ξ1 = [+1, +1, +1, +1, +1]^T
ξ2 = [+1, −1, −1, +1, −1]^T
ξ3 = [−1, +1, −1, +1, +1]^T
Evaluate the 5-by-5 synaptic weight matrix of the network.


b) Contrast and compare a recurrent network configuration with a feed-forward network.
[8+8]
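An answer to 8(a) can be checked numerically with the outer-product (Hebbian) storage rule; the 1/N scaling and the zero-diagonal (no self-feedback) convention below follow one common textbook convention and are stated assumptions.

```python
import numpy as np

# The three fundamental memories from question 8(a), one per row.
xi = np.array([
    [+1, +1, +1, +1, +1],
    [+1, -1, -1, +1, -1],
    [-1, +1, -1, +1, +1],
], dtype=float)

N = xi.shape[1]                  # number of neurons (5)
W = (xi.T @ xi) / N              # sum of outer products xi_k xi_k^T, scaled by 1/N
np.fill_diagonal(W, 0.0)         # no self-connections
```

By construction W is symmetric with a zero diagonal, which is what makes the network's energy function non-increasing under asynchronous updates.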
--ooOoo--
Code.No: 36026
R05 SET-4
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
III B.TECH II SEM SUPPLEMENTARY EXAMINATIONS FEBRUARY - 2010
NEURAL NETWORKS
(COMPUTER SCIENCE & ENGINEERING)
Time: 3 Hours                                Max. Marks: 80
Answer any FIVE questions
All questions carry equal marks
---

1.a) Explain how neural networks can be used to represent knowledge.


b) Explain the concept of Pattern Recognition using Neural Networks. [8+8]

2.a) Explain the learning rule which is based on statistical mechanics.


b) Explain the learning rule which operates on the concept of “memorizing data”. [8+8]

3.a) Explain how learning curves are used to assess the performance of adaptive filters.
b) Explain about the ensemble-averaged learning curve. [8+8]

4.a) What is the XOR problem? Explain.


b) Explain how back-propagation is able to solve the XOR problem. [8+8]

5.a) Write about the Quasi-Newton method for the training of a multilayer perceptron.
b) What are the factors affecting the composition of the eigenvalues of a multilayer
perceptron trained with the back-propagation algorithm?
c) What is Widrow's rule of thumb? Explain it. [8+4+4]

6. Write short notes on the following properties of the feature map:


a) Topological ordering
b) Density matching
c) Feature selection. [5+6+5]

7.a) Discuss Lyapunov's theorems on the stability and asymptotic stability of an autonomous
nonlinear dynamical system.
b) Discuss about attractors. [8+8]

8.a) Write about how to calculate the synaptic weights of the network using the outer-product
rule.
b) Use a Hopfield net to store the following four vectors:
A = (+1, -1, -1, +1, +1, -1, -1)
B = (+1, +1, -1, +1, +1, +1, -1)
C = (+1, +1, +1, -1, +1, -1, -1)
D = (+1, -1, -1, -1, -1, -1, -1)
Evaluate the synaptic weight matrix. [8+8]
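The outer-product rule of part (a) can be applied directly to the four vectors of part (b); as before, the 1/N scaling and the zero-diagonal convention are assumptions, since the question does not fix a convention.

```python
import numpy as np

# The four 7-component vectors from question 8(b), one per row.
patterns = np.array([
    [+1, -1, -1, +1, +1, -1, -1],   # A
    [+1, +1, -1, +1, +1, +1, -1],   # B
    [+1, +1, +1, -1, +1, -1, -1],   # C
    [+1, -1, -1, -1, -1, -1, -1],   # D
], dtype=float)

N = patterns.shape[1]                # number of neurons (7)
W = (patterns.T @ patterns) / N      # Hebbian sum of outer products, scaled by 1/N
np.fill_diagonal(W, 0.0)             # zero self-feedback
```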

--ooOoo--
