This document contains two sets of questions (Set A and Set B) for a statistical methods quiz. Set A includes questions about: (1) finding vectors to satisfy an equation in a subspace, (2) finding eigenvalues and eigenvectors of a matrix, (3) finding an orthogonal projection onto a plane, (4) finding the rank of matrices, (5) explaining augmented feature and weight vectors for classification, and (6) explaining gradient descent algorithms. Set B contains similar questions about subspaces, eigenvalues/eigenvectors, orthogonal projections, matrix ranks, the conditions for linear functions, and the importance of learning rate and margins in perceptron algorithms.
Set A

1) Let V and W be the subspaces of R^2 spanned by (1, 1) and (1, 4), respectively. Find vectors v ∈ V and w ∈ W so that v + w = (2, 3).

2) Find the eigenvalues and eigenvectors of the following matrix:

   M = [ -2   2 ]
       [  3  -1 ]
3) Find the orthogonal projection of the point P(5, -6, 3) on the plane 3x - 2y + z = 2.
4) Find the rank of each of the following matrices:

   a. M1 = [ 1  2  3 ]
           [ 0  7  9 ]

   b. M2 = [ 1  2  1 ]
           [ 2  3  1 ]
           [ 1  1  0 ]

5) With respect to linear classification, for a D-dimensional feature vector, what are the augmented feature vector and the augmented weight vector? Explain the importance of this augmentation for classification.

6) Write the basic gradient descent algorithm and explain the parameters and variables involved. What is the error function in the case of the single-sample/batch perceptron algorithms with relaxation?

------------------------------------- ANSWERS ----------------------------------------
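The answers themselves are missing from this copy. As a numerical sanity check, the computational parts of Set A can be sketched with NumPy. Two caveats: the sign before 2y in question 3 is missing from the extracted text and is assumed here to be a minus, and the gradient-descent demo for question 6 uses a toy objective J(w) = ||w||^2 chosen purely for illustration:

```python
import numpy as np

# Q1: write v = a*(1,1) and w = b*(1,4); solve the 2x2 system v + w = (2,3).
A = np.column_stack(([1, 1], [1, 4]))
a, b = np.linalg.solve(A, [2, 3])
v, w = a * np.array([1, 1]), b * np.array([1, 4])
print(v, w)             # v = (5/3, 5/3), w = (1/3, 4/3)

# Q2: eigenvalues and eigenvectors of M.
M = np.array([[-2, 2], [3, -1]])
eigvals, eigvecs = np.linalg.eig(M)
print(sorted(eigvals))  # -4 and 1

# Q3: project P onto the plane 3x - 2y + z = 2
# (the sign before 2y is missing in the extracted text; minus is assumed).
P = np.array([5, -6, 3])
n = np.array([3, -2, 1])                # plane normal
proj = P - ((P @ n - 2) / (n @ n)) * n
print(proj)                             # (-1, -2, 1), which lies on the plane

# Q4: matrix ranks.
M1 = np.array([[1, 2, 3], [0, 7, 9]])
M2 = np.array([[1, 2, 1], [2, 3, 1], [1, 1, 0]])
print(np.linalg.matrix_rank(M1), np.linalg.matrix_rank(M2))  # 2 2

# Q6: basic gradient descent, w <- w - eta * grad J(w);
# eta is the learning rate, grad the gradient of the criterion J.
def gradient_descent(grad, w0, eta=0.1, steps=100):
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - eta * grad(w)
    return w

# Toy objective J(w) = ||w||^2, gradient 2w (an illustration, not from the quiz):
w_star = gradient_descent(lambda w: 2 * w, [3.0, -2.0])
print(w_star)                           # converges toward the minimizer 0
```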
Statistical Methods in AI Monsoon 2014
Quiz-1, 22nd August

Set B

1) Let V and W be the subspaces of R^2 spanned by (1, -1) and (2, 1), respectively. Find vectors v ∈ V and w ∈ W so that v + w = (1, 1).

2) Find the eigenvalues and eigenvectors of the following matrix:

   M = [ 4  -1 ]
       [ 2   1 ]
3) Find the orthogonal projection of the point P(7, -2, 8) on the plane x - y + 3z = 1.
4) Find the rank of each of the following matrices:

   a. M1 = [ -1  8   3 ]
           [  8  1  19 ]

   b. M2 = [ 1  2  1 ]
           [ 2  3  1 ]
           [ 1  1  2 ]

5) What is the condition for linearity of a function f whose domain dimension is D? Is the discriminant function g(x1, x2) = w1·x1 + w2·x2 + w0 (where w1, w2, w0 are the parameters) linear?

6) What is the importance of the learning rate in perceptron learning algorithms? What is the margin in the relaxation-with-margin algorithm, and what does it signify?

------------------------------------- ANSWERS ----------------------------------------
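The Set B answers are likewise missing. The computational questions can be checked numerically the same way; note two assumptions about the garbled extraction: the missing sign in the plane equation of question 3 is again taken to be a minus, and M1 in question 4 is read as a 2x3 matrix, mirroring Set A's layout:

```python
import numpy as np

# Q1: v = a*(1,-1), w = b*(2,1); solve v + w = (1,1) for (a, b).
A = np.column_stack(([1, -1], [2, 1]))
a, b = np.linalg.solve(A, [1, 1])
v, w = a * np.array([1, -1]), b * np.array([2, 1])
print(v, w)             # v = (-1/3, 1/3), w = (4/3, 2/3)

# Q2: eigenvalues and eigenvectors of M.
M = np.array([[4, -1], [2, 1]])
eigvals, eigvecs = np.linalg.eig(M)
print(sorted(eigvals))  # 2 and 3

# Q3: project P onto the plane x - y + 3z = 1 (minus sign assumed).
P = np.array([7, -2, 8])
n = np.array([1, -1, 3])                # plane normal
proj = P - ((P @ n - 1) / (n @ n)) * n
print(proj)                             # (45/11, 10/11, -8/11)

# Q4: matrix ranks (M1 assumed 2x3, as in Set A).
M1 = np.array([[-1, 8, 3], [8, 1, 19]])
M2 = np.array([[1, 2, 1], [2, 3, 1], [1, 1, 2]])
print(np.linalg.matrix_rank(M1), np.linalg.matrix_rank(M2))  # 2 3
```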