
Google's PageRank

Compute the probability that a random surfer will end up at a particular webpage. We need to know how many webpages there are. Suppose there are 5 websites. [Figure: a directed graph on the 5 pages; an arrow from page j to page i means that page j links to page i.]

Make a 5x5 matrix A = (Ai,j), i, j = 1, ..., 5, where

Ai,j = 1 if page j links to page i, and 0 otherwise.

For our example,

A =

0 0 0 0 1
1 0 1 0 0
0 1 0 0 0
1 0 1 0 0
1 0 0 0 0

We want a new matrix obtained by dividing each entry by the sum of the entries in its column. Problem: what if page j doesn't link to anything? Then column j is all zeros. Solution: change every entry in that column to 1, giving a matrix A'.

A' =

0 0 0 1 1
1 0 1 1 0
0 1 0 1 0
1 0 1 1 0
1 0 0 1 0

Now divide each entry by its column sum to get B:

B =

0    0  0    1/5  1
1/3  0  1/2  1/5  0
0    1  0    1/5  0
1/3  0  1/2  1/5  0
1/3  0  0    1/5  0
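The construction so far can be sketched numerically. This is a minimal sketch in NumPy (pages 0-indexed), using the stated convention that entry (i, j) is 1 when page j links to page i; the link matrix is the one from this example.

```python
import numpy as np

# Link matrix for the 5-page example, using the convention
# A[i, j] = 1 if page j links to page i (pages 0-indexed here).
A = np.array([
    [0, 0, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 0, 0, 0, 0],
], dtype=float)

# A dangling page (one that links to nothing) gives a column of zeros;
# replace any such column with all ones to get A'.
A_prime = A.copy()
A_prime[:, A_prime.sum(axis=0) == 0] = 1.0

# Divide each entry by its column sum: B is column-stochastic.
B = A_prime / A_prime.sum(axis=0)

print(B.sum(axis=0))  # every column sums to 1
```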

We assume that our random surfer has a certain tolerance for just clicking on links. Here, we assume it is .85. This means that with probability .15 (15%) our random surfer gets bored and starts over at a new webpage chosen at random.

Replace B with the matrix

C = .85B + (.15/n) Σi,j Ei,j

where Ei,j = the matrix with a 1 in the (i,j) position and zeros in every other position, and n = the number of webpages. Here n = 5, so .15/5 = .03 and each entry of C is .85Bi,j + .03.

For example, an entry where Bi,j = 1/3 becomes (.85/3) + .03 = (.85/3) + (.09/3) = .94/3.

We get

C =

.03    .03  .03    .2  .88
.94/3  .03  .455   .2  .03
.03    .88  .03    .2  .03
.94/3  .03  .455   .2  .03
.94/3  .03  .03    .2  .03
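The damping step can also be sketched in code. This assumes the column-stochastic B from the example (every column sums to 1, per the column-sum convention in these notes).

```python
import numpy as np

# Column-stochastic B for the example (each column sums to 1).
B = np.array([
    [0,   0, 0,   1/5, 1],
    [1/3, 0, 1/2, 1/5, 0],
    [0,   1, 0,   1/5, 0],
    [1/3, 0, 1/2, 1/5, 0],
    [1/3, 0, 0,   1/5, 0],
])

# With probability .85 the surfer follows a link; with probability .15
# they restart at one of the n = 5 pages, adding .15/5 = .03 everywhere.
n = 5
C = 0.85 * B + 0.15 / n

print(C.min(), C.max())  # smallest entry .03, largest .88; all positive
```

Since every entry of C is positive, the Perron-Frobenius theorem below applies to it.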

Claim: λ = 1 is an eigenvalue for C. Note that this is not so clear! It is obvious, though, that λ = 1 is an eigenvalue for Ct.

For Ct, let's take the vector

v = (1, 1, 1, 1, 1)t

Every column of C sums to 1, so every row of Ct sums to 1; hence Ct v = v, and λ = 1 is an eigenvalue for Ct.

Theorem: If λ is an eigenvalue for At, then λ is an eigenvalue for A. The eigenvectors may be different! Therefore, C has eigenvalue λ = 1.

Perron-Frobenius Theorem: Let A be a matrix with all positive entries. If λ is the largest eigenvalue of A, then λ is positive and admits an eigenvector with all positive entries. This theorem has content, since a matrix with all positive entries is not necessarily positive definite!

Example: A =

1 2
1 1

The eigenvalues of A are λ = 1 ± √2. The largest, λ = 1 + √2, is positive and has eigenvector (√2, 1), whose entries are all positive.
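The example can be checked numerically; a minimal sketch with NumPy:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [1.0, 1.0]])

vals, vecs = np.linalg.eig(A)

# Eigenvalues are 1 + sqrt(2) and 1 - sqrt(2).
lam = vals.max()              # largest eigenvalue, 1 + sqrt(2) > 0
v = vecs[:, vals.argmax()]
if v[0] < 0:
    v = -v                    # rescale so the eigenvector is positive

print(lam)                    # about 2.41421
print(v)                      # both entries positive, proportional to (sqrt(2), 1)
```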

(missed a bunch of math work) Then the PageRank of website i is the entry wi. Google search will order sites by PageRank.
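One standard way to find the positive eigenvector for λ = 1 (the notes skip the computation) is power iteration. This is a sketch, not necessarily the method used in lecture, reusing C from the example above:

```python
import numpy as np

# Column-stochastic B for the example, then damp to get C.
B = np.array([
    [0,   0, 0,   1/5, 1],
    [1/3, 0, 1/2, 1/5, 0],
    [0,   1, 0,   1/5, 0],
    [1/3, 0, 1/2, 1/5, 0],
    [1/3, 0, 0,   1/5, 0],
])
n = 5
C = 0.85 * B + 0.15 / n

# Power iteration: repeatedly apply C to a uniform start vector;
# the iterates converge to the positive eigenvector for lambda = 1.
w = np.full(n, 1.0 / n)
for _ in range(200):
    w = C @ w
w = w / w.sum()   # normalize so the entries sum to 1

# The PageRank of page i is w[i]; sites are ordered by this score.
print(np.argsort(w)[::-1] + 1)  # pages from highest to lowest rank
```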

Exam covers 4.8, 5.1-5.4, PageRank, Norms of Matrices, Polar Decomposition.

THINGS TO KNOW

Section 4.8
- the definition
- what a linear difference equation is and how to show a given signal satisfies an equation
- what a homogeneous linear difference equation is
o a0yk+n + a1yk+n-1 + ... + anyk = 0
- the collection of all signals satisfying a given homogeneous linear difference equation is a subspace; know how to find the dimension
- what the Casorati matrix is and how to find it; how to use it to show linear independence of signals
o 2 signals: Casorati matrix is 2x2
o 3 signals: 3x3
o 4 signals: 4x4
o n signals: nxn

Example of Casorati: xk = 4^k cos(k*pi/2), zk = 4^k sin(k*pi/2), and the equation yk+2 + 16yk = 0.

Does {(xk), (zk)} form a basis for the space of signals satisfying yk+2 + 16yk = 0?

Casorati matrix (2x2):

[ xk    zk
  xk+1  zk+1 ]

= [ 4^k cos(k*pi/2)          4^k sin(k*pi/2)
    4^(k+1) cos((k+1)*pi/2)  4^(k+1) sin((k+1)*pi/2) ]

Check the signals satisfy the equation. For xk:

4^(k+2) cos((k+2)*pi/2) + 16*4^k cos(k*pi/2)          [16*4^k = 4^2*4^k = 4^(k+2)]
= 4^(k+2) ( cos((k+2)*pi/2) + cos(k*pi/2) )
= 4^(k+2) ( -cos(k*pi/2) + cos(k*pi/2) )              [cos(pi + k*pi/2) = -cos(k*pi/2)]
= 0

Now for zk:

4^(k+2) sin((k+2)*pi/2) + 16*4^k sin(k*pi/2)
= 4^(k+2) ( sin((k+2)*pi/2) + sin(k*pi/2) )
= 4^(k+2) ( -sin(k*pi/2) + sin(k*pi/2) )              [sin(pi + k*pi/2) = -sin(k*pi/2)]
= 0

Now that you know the signals satisfy the same equation, plug a single value of k (usually k = 0) into the Casorati matrix and check whether it is invertible.

k = 0:

[ cos(0)      sin(0)
  4cos(pi/2)  4sin(pi/2) ]

= [ 1 0
    0 4 ]

The determinant is 4, so the matrix is invertible, which means the signals are linearly independent! The subspace of all solutions to yk+2 + 16yk = 0 has dimension 2, since we can write the equation as yk+2 + 0*yk+1 + 16yk = 0. Therefore {(xk), (zk)} form a basis for the space of signals satisfying yk+2 + 16yk = 0.
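The Casorati check above can be mirrored in code; a small sketch using only the standard library, with x and z the signals from the example:

```python
import math

# Signals x_k = 4^k cos(k*pi/2) and z_k = 4^k sin(k*pi/2).
def x(k):
    return 4 ** k * math.cos(k * math.pi / 2)

def z(k):
    return 4 ** k * math.sin(k * math.pi / 2)

# Both satisfy y_{k+2} + 16 y_k = 0 (up to floating-point error).
for k in range(6):
    assert abs(x(k + 2) + 16 * x(k)) < 1e-9
    assert abs(z(k + 2) + 16 * z(k)) < 1e-9

# Determinant of the 2x2 Casorati matrix at k = 0.
det = x(0) * z(1) - z(0) * x(1)
print(det)  # approximately 4: invertible, so the signals are independent
```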

Chapter 5
- what an eigenvalue is and how to find them:
o det(A - λI) = 0 is one equivalent condition
- what an eigenvector associated to a given eigenvalue is and how to find them:
o eigenvector associated to λ: Av = λv for v nonzero
- for a given λ and A, the set of all vectors v satisfying Av = λv is a subspace
- definition of the characteristic polynomial: det(A - λI)
- what it means to diagonalize a matrix
- how to use eigenvalues and eigenvectors of a matrix to diagonalize: S^-1 A S = D
o D = diagonal matrix of eigenvalues, S = matrix of eigenvectors
- definitions of orthogonal and positive semi-definite matrices and what these give you in terms of eigenvalues

o Positive semi-definite: (Av)·v ≥ 0 for all vectors v (A is nxn, v is in R^n); eigenvalues are all non-negative
o Orthogonal matrix: AtA = I; every eigenvalue has absolute value 1 (real eigenvalues are in {±1})
Example: A =
[ 1/√2  -1/√2
  1/√2   1/√2 ]
At =
[ 1/√2   1/√2
 -1/√2   1/√2 ]
AtA =
[ 1 0
  0 1 ]
- how to find the formula for the nth power of a diagonalizable matrix
o A = SDS^-1, so A^n = SD^nS^-1
- any symmetric matrix is diagonalizable with S orthogonal
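The nth-power formula can be illustrated with a small example (a sketch; the symmetric matrix here is arbitrary, chosen so S comes out orthogonal):

```python
import numpy as np

# A symmetric matrix, so it diagonalizes with an orthogonal S.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3

vals, S = np.linalg.eigh(A)  # eigh handles the symmetric case; S is orthogonal
# A = S D S^-1 with D = diag(vals); since S is orthogonal, S^-1 = S^t.

# A^n = S D^n S^-1: raise only the diagonal entries to the nth power.
n = 5
A_n = S @ np.diag(vals ** n) @ S.T

print(A_n)
print(np.linalg.matrix_power(A, n))  # agrees with the formula
```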

PageRank
- how to find it!
- don't need to check that λ = 1 is an eigenvalue, but know that it always is
- once you have v = [v1, v2, ..., vn], the eigenvector with all positive entries associated to λ = 1, the PageRank of page Pk is vk (something here - finish it)

Norms and Polar Decomposition
- the definitions
- how to find them using eigenvalues and eigenvectors
- polar decomposition: only for nxn matrices; need both |A| and W
