
Code No: 07A80407

R07

Set No. 2

IV B.Tech II Semester Examinations, APRIL 2011
ARTIFICIAL NEURAL NETWORKS
Common to Electronics And Telematics, Electronics And Communication Engineering
Time: 3 hours                                Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks

1. (a) With a suitable diagram, explain the architecture of the complete counter propagation network.
   (b) Describe the data structures used for Adaline and Madaline simulators. [8+8]

2. (a) Explain the advantages and disadvantages of the ART network.
   (b) Discuss the significance of the following for the ART network:
       i. Reset
       ii. Vigilance
       iii. Gain1 and Gain2. [8+8]

3. Explain and discuss the Recurrent Neural Network (RNN). For what type of problems are RNNs most suitable? [16]

4. (a) Derive the weight update equation for the discrete Perceptron and write its summary algorithm.
   (b) Explain the limitations of backpropagation learning. Also explain the scope to overcome these limitations. [8+8]

5. Give the statement of the optimization problem with equality constraints and explain how it can be solved using a Hopfield neural network. [16]

6. (a) Explain the biological prototype of the neuron. Also explain the characteristics of the neuron.
   (b) List and explain the various activation functions used in modeling an artificial neuron. Also explain their suitability with respect to applications. [8+8]

7. (a) Explain the concept of the Hebbian learning principle and its mathematical modeling.
   (b) Given are a set of input training vectors and an initial weight vector. The learning constant is assumed to be 0.1. The desired responses for X1, X2 and X3 are d1 = -1, d2 = -1 and d3 = 1 respectively for a bipolar binary case.
       X1 = [1, 2, 0, 1]^T, X2 = [0, 1.5, -0.5, -1.0]^T and X3 = [-1, 1, 0.5, -1]^T. W0 = [1, -1, 0, 0.5]^T.
       With the Widrow-Hoff learning rule, evaluate the weight vector after completion of one cycle of training. [8+8]
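For question 7(b), a minimal worked sketch of the Widrow-Hoff (LMS) rule is given below, assuming the usual update w <- w + c (d - w.x) x applied to each training pair in turn; NumPy is used only for convenience and is not part of the question.

    import numpy as np

    # Training vectors, desired responses, initial weights and learning
    # constant exactly as stated in question 7(b).
    X = [np.array([1.0, 2.0, 0.0, 1.0]),
         np.array([0.0, 1.5, -0.5, -1.0]),
         np.array([-1.0, 1.0, 0.5, -1.0])]
    d = [-1.0, -1.0, 1.0]
    w = np.array([1.0, -1.0, 0.0, 0.5])
    c = 0.1

    # One cycle of Widrow-Hoff (LMS) training: w <- w + c * (d - w.x) * x.
    for x_i, d_i in zip(X, d):
        error = d_i - np.dot(w, x_i)
        w = w + c * error * x_i
        print("error = %+.4f, w = %s" % (error, np.round(w, 4)))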


8. Consider a Kohonen net with two cluster units and five input units. The weight vectors for the cluster units are:
   W1 = (0.1, 0.3, 0.5, 0.7, 0.9) and W2 = (0.9, 0.7, 0.5, 0.3, 0.1)
   Using Euclidean distance, find the winning cluster unit for the input pattern. [16]
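For question 8, a minimal sketch of the winner selection step is shown below; since the input pattern itself is not reproduced in the paper as printed, the vector x here is only an illustrative placeholder.

    import numpy as np

    # Weight vectors of the two cluster units as given in question 8.
    W = np.array([[0.1, 0.3, 0.5, 0.7, 0.9],
                  [0.9, 0.7, 0.5, 0.3, 0.1]])

    # Placeholder input pattern (the actual pattern is omitted in the printed paper).
    x = np.array([0.0, 0.2, 0.1, 0.2, 0.0])

    # The winning cluster unit is the one whose weight vector is closest
    # to x in Euclidean distance.
    distances = np.linalg.norm(W - x, axis=1)
    winner = int(np.argmin(distances))
    print("distances =", np.round(distances, 4), "winning unit =", winner + 1)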

Code No: 07A80407

R07

Set No. 4

IV B.Tech II Semester Examinations, APRIL 2011
ARTIFICIAL NEURAL NETWORKS
Common to Electronics And Telematics, Electronics And Communication Engineering
Time: 3 hours                                Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks

1. (a) Explain the architecture of the self-organizing map network.
   (b) Explain the Kohonen layer training algorithm. [16]

2. What is a Counter Propagation Network (CPN)? Compare its advantages over BPA. [16]

3. (a) Describe the Hopfield neural network models.
   (b) Explain the concept of stability in neural networks and discuss the stability of Hopfield networks. [8+8]

4. (a) Explain the structure and working of a neuron with the help of suitable diagrams.
   (b) Explain in detail the properties of the biological neuron.
   (c) Compare the biological neuron and the artificial neuron. [6+5+5]

5. (a) Write the Hebbian learning rule.
   (b) Write the perceptron learning rule. [8+8]
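For question 5, the standard textbook forms of the two rules are sketched below; the notation (learning rate \eta, input vector x, output y, desired response d) is assumed here, since the paper fixes no symbols.

\[ \text{Hebbian rule:} \quad \Delta w = \eta \, y \, x \]
\[ \text{Perceptron rule:} \quad \Delta w = \eta \, (d - y) \, x, \qquad y = \operatorname{sgn}(w^{T} x) \]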

6. Construct a Hopfield network to associate 3x3 input images with dots and dashes. What are the limitations of the Hopfield network? [10+6]

7. Use the outer products rule to find the weight matrix in bipolar form for the BAM network based on the following binary input-output vector pairs (a worked sketch follows after question 8):
   s(1) = (1 0 0 1), t(1) = (0, 1)
   s(2) = (1 0 1 0), t(2) = (0, 1)
   s(3) = (1 1 0 0), t(3) = (1, 0)
   s(4) = (1 1 1 1), t(4) = (1, 0). [16]

8. (a) Briefly discuss the sequential and batch modes of training in a backpropagation algorithm and also the stopping criteria.
   (b) Briefly explain a few applications of backpropagation. [8+8]
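For question 7, the sketch below illustrates the outer products rule for a BAM under the common convention of first converting the binary pairs to bipolar form (0 -> -1, 1 -> +1) and then summing the outer products s(k) t(k)^T; it is an illustrative computation, not a prescribed solution format.

    import numpy as np

    # Binary training pairs from question 7.
    s_binary = np.array([[1, 0, 0, 1],
                         [1, 0, 1, 0],
                         [1, 1, 0, 0],
                         [1, 1, 1, 1]])
    t_binary = np.array([[0, 1],
                         [0, 1],
                         [1, 0],
                         [1, 0]])

    # Convert binary {0, 1} to bipolar {-1, +1}.
    S = 2 * s_binary - 1
    T = 2 * t_binary - 1

    # Outer products rule: W = sum over k of s(k) t(k)^T, a 4 x 2 weight matrix.
    W = sum(np.outer(s_k, t_k) for s_k, t_k in zip(S, T))
    print(W)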

Code No: 07A80407

R07

Set No. 1

IV B.Tech II Semester Examinations, APRIL 2011
ARTIFICIAL NEURAL NETWORKS
Common to Electronics And Telematics, Electronics And Communication Engineering
Time: 3 hours                                Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks

1. What are Kohonen's self-organizing maps? Explain the architecture and the training algorithm used for Kohonen's SOMs. [16]

2. (a) Distinguish between static neural networks and dynamic neural networks with examples.
   (b) Explain the applications of Hopfield networks and mention their limitations. [8+8]

3. With a schematic two-layer feedforward neural network, derive its learning algorithm. Also discuss the learning issues of backpropagation. [16]

4. (a) Explain the following terms:
       i. Recall
       ii. Autoassociation
       iii. Classification
       iv. Recognition.
   (b) What is the difference between learning and training?
   (c) Explain the types of learning rules. [8+4+4]

5. (a) Explain the structure of the brain and its organization.
   (b) With suitable diagrams, explain the model of the artificial neuron and also explain the important activation functions used in ANNs. [8+8]

6. (a) Explain the characteristics and applications of the ART network.
   (b) With a neat architecture, explain the training algorithm used in the ART network. [8+8]

7. Discuss how a particular neural network is selected for a particular problem, viz., an optimization problem, a pattern recognition problem and a classification problem. [16]

8. (a) Explain briefly the counter propagation training algorithm.
   (b) Explain the architecture of the Grossberg layer and its learning algorithm. [8+8]

Code No: 07A80407

R07

Set No. 3

IV B.Tech II Semester Examinations, APRIL 2011
ARTIFICIAL NEURAL NETWORKS
Common to Electronics And Telematics, Electronics And Communication Engineering
Time: 3 hours                                Max Marks: 80
Answer any FIVE Questions
All Questions carry equal marks

1. (a) What are the three models of the artificial neuron? Explain them in detail.
   (b) Compare and contrast artificial neural networks with a conventional computer system. [8+8]

2. Explain the function of the ART network and explain its operation with relevant equations. [16]

3. (a) What are the assumptions made in the McCulloch-Pitts theory? Explain.
   (b) What are the different types of learning schemes used in training artificial neural networks? Explain each of them clearly. [6+10]

4. Explain Kohonen's self-organized feature map algorithm and mention its applications. [16]

5. (a) Explain the architecture and training method of the self-organizing map network.
   (b) Explain the Grossberg layer training algorithm. [8+8]

6. (a) Explain how the pattern mode and batch mode of training affect the result of backpropagation learning.
   (b) What is the significance of the momentum term in backpropagation learning?
   (c) Explain the refinements of backpropagation learning and also the interpretation of the result of the learning. [5+5+6]

7. Discuss how the ART network can be used for
   (a) image processing
   (b) Chavadis recognition. [8+8]

8. (a) Explain the Hopfield neural network from fundamentals.
   (b) Discuss the capacity of the Hopfield neural network. [8+8]
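For question 8(b), a commonly cited estimate may serve as context: for a Hopfield network of N bipolar units trained with the Hebbian (outer products) rule, roughly

\[ p_{\max} \approx 0.138 \, N \]

patterns can be stored before retrieval errors become significant; stricter error-free recall criteria give smaller bounds on the order of N / \log N.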
