
Roll No.: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
Amrita Vishwa Vidyapeetham
B.Tech. Degree Examinations Nov/Dec. 2014
Seventh Semester
ECE457 Pattern Recognition Techniques and Algorithms
(Common to Electronics and Communication Engineering and
Electronics and Instrumentation Engineering)
Time: Three hours Maximum: 100 Marks
Answer all questions
Use of Standard Normal table is permitted

1. For what characteristics of a data set is each of the following true? 3 Marks

(a) MEAN = MODE = MEDIAN
(b) MODE < MEDIAN < MEAN
(c) MEAN < MEDIAN < MODE

2. Find the mid-range for the following data: 1 Mark


X = {1, 3, 4, 8, 9, 15, 21}
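The mid-range is the average of the smallest and largest observations; a short Python sketch of the computation:

```python
# Mid-range = (min + max) / 2 of the data set.
X = [1, 3, 4, 8, 9, 15, 21]
mid_range = (min(X) + max(X)) / 2
print(mid_range)  # → 11.0
```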

3. Find the mean deviation and the mean absolute deviation for the set of numbers:
X = {12, 14, 5, 7, 8, 15, 32, 21, 54, 67, 98, 87}. 2 Marks
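Both quantities are taken about the mean here; the signed mean deviation about the mean is always zero, while the mean absolute deviation averages the magnitudes. A Python sketch:

```python
# Mean deviation (signed, about the mean) and mean absolute deviation (MAD).
X = [12, 14, 5, 7, 8, 15, 32, 21, 54, 67, 98, 87]
mean = sum(X) / len(X)                            # 420 / 12 = 35.0
mean_dev = sum(x - mean for x in X) / len(X)      # zero about the mean
mad = sum(abs(x - mean) for x in X) / len(X)      # 332 / 12 ≈ 27.67
```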

4. For the following sample data, construct a normal plot and check whether the given sample
is normally distributed. 4 Marks

X = {35, 43, 46, 48, 51, 55, 58, 65}.

5. Given that a random variable X is Gaussian distributed and that its true mean μ is known,
estimate the sample variance using maximum likelihood estimation. 6 Marks

6. Suppose that the values 2, 3, 4, 5, 6, 9, 11 and 16 came from a uniform distribution. Use the
method of moments to estimate the range of the random variable. 6 Marks
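The method of moments matches the sample mean and second central moment to those of a uniform(a, b) density, E[X] = (a + b)/2 and Var[X] = (b − a)²/12. A minimal Python sketch (using the biased second moment, one common convention for moment matching):

```python
import math

# Method-of-moments fit of a uniform(a, b) distribution to the sample.
X = [2, 3, 4, 5, 6, 9, 11, 16]
m = sum(X) / len(X)                          # sample mean = 7.0
v = sum((x - m) ** 2 for x in X) / len(X)    # second central moment = 19.5
half_width = math.sqrt(3 * v)                # since (b - a)^2 / 12 = v
a, b = m - half_width, m + half_width        # estimated range [a, b]
```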

7. Classes A and B are bivariate normally distributed with (μx, μy, σx, σy, ρ) = (0, 0, 1, 2, 0)
for class A and (2, 0, 1, 1, 0) for class B. P(A) = 2/5, P(B) = 3/5, and the cost of misclassifying
an A is three times that of misclassifying a B. Obtain an expression for the optimal decision
boundary. 8 Marks

8. In a data set of three samples x, y, z from a population of As and Bs, P(A|x), P(A|y) and
P(A|z) were estimated to be 0.1, 0.8, and 0.4. Estimate the error rate for the classifier. 6 Marks
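One standard estimate from posterior probabilities: at each sample the classifier picks the more probable class, so that sample contributes min(p, 1 − p) to the error; averaging over the samples gives the estimate. Sketched in Python:

```python
# Error-rate estimate from estimated posteriors P(A | sample).
# Each sample contributes min(p, 1 - p): the probability the chosen class is wrong.
posteriors = [0.1, 0.8, 0.4]
error_rate = sum(min(p, 1 - p) for p in posteriors) / len(posteriors)
# (0.1 + 0.2 + 0.4) / 3 ≈ 0.233
```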

9. Suppose that the densities for classes A and B are as shown in Figure-1. The decision
boundary is placed at x = 4.5, with P(A) = ___ and P(B) = ___. Calculate the error rates
P(ε|A) and P(ε|B) and the overall error rate P(ε) of the classifier. 6 Marks

Page 1 of 3
10. Show that the nearest neighbor error rate is nearly twice as large as the Bayesian error rate
for low error probabilities. 8 Marks

11. Assume that the data points lie at x = 1, 2, 3, 3. Estimate the density function at x = 2.5
using a rectangular window with a width of 2. 6 Marks
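With a rectangular (Parzen) window of width h, the estimate at x is the number of samples within h/2 of x, divided by n·h. A Python sketch:

```python
# Parzen-window density estimate with a rectangular kernel of width h:
# p(x) = (1 / (n * h)) * #{ xi : |xi - x| <= h / 2 }.
samples = [1, 2, 3, 3]
x, h = 2.5, 2.0
inside = sum(1 for xi in samples if abs(xi - x) <= h / 2)
density = inside / (len(samples) * h)   # 3 samples in [1.5, 3.5] → 3 / 8
```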

12. A sample from class A is located at (x, y, z) = (1, 2, 3), a sample from class B is at
(7, 4, 5), and a sample from class C is at (6, 2, 1). 8 Marks
(a) How would a sample at (3, 4, 5) be classified using the single nearest neighbor technique
and Euclidean distance?
(b) What if city block distance is used?
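The two metrics can give different neighbors; a Python sketch of the 1-NN comparison under both:

```python
import math

# 1-nearest-neighbor classification of the query under two metrics.
prototypes = {"A": (1, 2, 3), "B": (7, 4, 5), "C": (6, 2, 1)}
query = (3, 4, 5)

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def city_block(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

nn_euclidean = min(prototypes, key=lambda c: euclidean(prototypes[c], query))
nn_city_block = min(prototypes, key=lambda c: city_block(prototypes[c], query))
```

Under Euclidean distance A is nearest (√12 ≈ 3.46 versus 4 for B), but under city block distance B is nearest (4 versus 6 for A).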

13. At some point in training an adaptive decision boundary, D = 2 + 3x − 4x² + 5y. We want
D > 0 for members of class A. The next sample is an A with x = −2, y = 3. What would D
become after adapting it for this sample, assuming c = 1 and k = 2? 6 Marks

14. Perform hierarchical clustering of the data shown in Table-1 using the single linkage
algorithm with Euclidean distance. Show the distance matrices and the dendrogram, with
cluster distance as the vertical axis. 8 Marks

Sample x y
1 0.0 0.0
2 0.5 0.0
3 0.0 2.0
4 2.0 2.0
5 2.5 8.0
6 6.0 3.0
7 7.0 3.0
Table-1
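Under single linkage, the distance between two clusters is the minimum pairwise distance between their points, and at each step the two closest clusters are merged; the merge distances are the dendrogram heights. A minimal Python sketch of that loop for the Table-1 data:

```python
import math

# Single-linkage agglomerative clustering of the Table-1 points,
# recording the merge distance (dendrogram height) at each step.
points = {1: (0.0, 0.0), 2: (0.5, 0.0), 3: (0.0, 2.0), 4: (2.0, 2.0),
          5: (2.5, 8.0), 6: (6.0, 3.0), 7: (7.0, 3.0)}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

clusters = [frozenset([k]) for k in points]
merges = []
while len(clusters) > 1:
    # Single linkage: cluster distance = minimum distance over point pairs.
    d, i, j = min(((dist(points[p], points[q]), i, j)
                   for i in range(len(clusters))
                   for j in range(i + 1, len(clusters))
                   for p in clusters[i] for q in clusters[j]),
                  key=lambda t: t[0])
    merged = clusters[i] | clusters[j]
    clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    merges.append((d, sorted(merged)))
```

The successive merge heights come out as 0.5, 1.0, 2.0, 2.0, √17 ≈ 4.12 and √36.25 ≈ 6.02, which is what the hand-drawn dendrogram should show on its vertical axis.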

15. Design a neural network that classifies a sample as belonging to class 1 if the sample
produces a positive value for D = 34 + 8x₁ − 7x₂ + x₃, and as belonging to class 0 if the
sample produces a negative value for D. 8 Marks

16. Explain the sequential mean square error algorithm for designing a two-layer neural
network with one output. 6 Marks

17. Consider the following (3×6) image with eight possible gray levels. Construct the gray
level histogram, and the thresholded image obtained by using the minimum point between the
two largest histogram peaks as the threshold. 4 Marks

1 2 1 1 2 0
0 1 5 1 0 1
1 6 7 6 1 2
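A Python sketch of the histogram computation (the threshold then follows from inspecting the counts between the two tallest peaks):

```python
from collections import Counter

# Gray-level histogram of the 3x6 image (levels 0..7).
image = [[1, 2, 1, 1, 2, 0],
         [0, 1, 5, 1, 0, 1],
         [1, 6, 7, 6, 1, 2]]
hist = Counter(pixel for row in image for pixel in row)
counts = [hist.get(level, 0) for level in range(8)]
# counts[level] = number of pixels with that gray level; total = 18 pixels
```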

18. Erode the region consisting of 1s in the following image using the operator shown below,
where the * denotes the origin of the operator. 4 Marks

[Structuring-element figure not recoverable from the source; only its origin marker * survives.]

0 0 1 0 0 0
0 1 1 1 0 0
0 1 0 1 1 0

