(1) Sec 5.2: 3, 11, 14

(2) Sec 5.3: 1 (with justifications), 2(e,f), 5, 6, 10(a), 23

(3) In this question we ask you to fill in some of the missing parts of the proof of Thm 1 in the on-line Lecture 33. In the proof of Thm 1, we assumed that if a probability matrix P has the property that all entries of P^k are positive for k large enough, then all eigenvectors v of P satisfying P*v = v are multiples of a single vector whose entries are positive and sum to 1. Prove this by filling in the following steps. Recall that E_lambda(A) = nullspace(A - lambda*I).

  (1) Show that if dim(E_1(P^k)) = 1, i.e. all eigenvectors of P^k for the eigenvalue 1 are multiples of the same vector v, then all eigenvectors of P for the eigenvalue 1 are multiples of that same vector v.

  (2) Show that if a matrix A with eigenvalue lambda satisfies dim(E_lambda(A)) = 1, then the same is true of its transpose A^t.

  (3) Show dim(E_1((P^k)^t)) = 1 as follows. Let A = (P^k)^t and let v be an eigenvector: A*v = v. Let v_j be a component of v that is largest in absolute value (we write j for the index to avoid confusion with the power k). Let w = v/v_j, so w_j = 1. Then the j-th component of A*w = w is

        sum_{i=1 to n} A_ji*w_i = w_j = 1.

      Take absolute values and use the triangle inequality and Gershgorin's Theorem to conclude that all w_i = 1, and therefore dim(E_1((P^k)^t)) = 1 as desired.

  (4) Combine steps (1), (2) and (3) to conclude that dim(E_1(P)) = 1, i.e. that all eigenvectors of P for the eigenvalue 1 are multiples of the same vector v.

  (5) It remains to show that if P*v = v, then we can choose v so that all entries are positive and sum to 1. Use the LU decomposition of P - I to show that v may be assumed to be real (eigenvectors of real matrices do not all have to be real: think about eigenvectors for complex eigenvalues). By part (1) we can use the equation P^k*v = v. Argue by contradiction as follows: suppose v has p > 0 positive and n-p > 0 nonpositive entries.
      Let Q be a permutation matrix such that the first p entries of

          w = Q*v = [ w1 ]  p
                    [ w2 ]  n-p

      namely w1, are positive, and the rest, w2, are nonpositive. Let

                                  p    n-p
          B = Q*(P^k)*inv(Q) = [ B11  B12 ]  p
                               [ B21  B22 ]  n-p

      and confirm that B*w = w, and that this is equivalent to

          B11*w1 + B12*w2 = w1
          B21*w1 + B22*w2 = w2.

      Let u be the vector of all ones of dimension p, so that

          (*)  u^t*B11*w1 + u^t*B12*w2 = u^t*w1.

      Use (*), the fact that each entry of B is positive, the fact that B has column sums equal to 1, and the fact that w1 has positive entries and w2 nonpositive entries, to argue that the left-hand side of (*) is strictly less than the right-hand side, a contradiction. Conclude that all entries of v have the same sign, and so v can be chosen with positive entries summing to 1.

  (6) Give a simple example to show that unless we assume all the entries of P^k are positive for some k, the result is false: P*v = v can have many linearly independent solutions, all with positive entries.

(4) We again fill in parts of the proof of Thm 1. We want to show that P has an eigenvalue at 1 of multiplicity 1, and that the other eigenvalues are all strictly less than 1 in absolute value. We again need to assume that the entries of P^k are all positive for some k large enough.

  (1) Use the fact that all entries of P^k are positive, together with Gershgorin's Theorem, to conclude that any eigenvalue of P^k must either equal 1 or be strictly less than 1 in absolute value.

  (2) Use the fact that if all entries of P^k are positive, then so are the entries of P^m for all m >= k, and the fact that the eigenvalues of P^m are the m-th powers of the eigenvalues of P, to conclude that any eigenvalue of P must either equal 1 or be strictly less than 1 in absolute value. Argue by contradiction: if P had an eigenvalue of absolute value 1 but not equal to 1, then for some arbitrarily large m, P^m would have an eigenvalue of absolute value 1 but not equal to 1.
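The conclusion of Thm 1 is easy to check numerically. The sketch below (not part of the assigned work) uses power iteration on a small column-stochastic matrix P of our own choosing, with all entries already positive (so k = 1): iterating P on any probability vector should converge to the unique eigenvector for eigenvalue 1, and that vector should have positive entries summing to 1.

```python
# Numerical illustration of Thm 1 (a sketch; P is an example matrix of our
# choosing, not one from the lecture). Pure Python, matrices as lists of rows.

def mat_vec(P, v):
    """Multiply matrix P by vector v."""
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

def steady_state(P, iters=200):
    """Power iteration: P^m * v0 for a probability vector v0.

    For a column-stochastic P with positive entries, this converges to the
    unique eigenvector for eigenvalue 1 with positive entries summing to 1.
    """
    n = len(P)
    v = [1.0 / n] * n          # start with any probability vector
    for _ in range(iters):
        v = mat_vec(P, v)      # column-stochastic P preserves the entry sum
    return v

# Every entry positive, every column sums to 1.
P = [[0.5, 0.2, 0.3],
     [0.3, 0.6, 0.3],
     [0.2, 0.2, 0.4]]

v = steady_state(P)
print(v)            # all entries positive, summing to 1
print(mat_vec(P, v))  # approximately equal to v, i.e. P*v = v
```

Starting from a different probability vector gives the same limit, which is the uniqueness statement the problem asks you to prove.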
(5) Sec 5.4: 18

(6) Sec 6.1: 1 (with justifications), 6, 7, 11, 12, 13, 15, 20, 24(a,d), 25

(7) Prove that

        <x,y> = sum_{i=1 to n} 2*x_i*y_i + sum_{i=1 to n-1} x_i*y_{i+1} + sum_{i=1 to n-1} y_i*x_{i+1}

    is an inner product on R^n. Hint: To show <x,x> > 0 unless x = 0, write <x,x> as a sum of squares.
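Before writing the proof for (7), it can help to sanity-check the axioms numerically. The sketch below (not a proof, and it does not give away the sum-of-squares hint) evaluates the form on a few sample vectors of our choosing and checks symmetry, positivity, and linearity in the first slot.

```python
# Numerical sanity check (a sketch, not a proof) that the form in problem (7)
# behaves like an inner product on some sample vectors.

def ip(x, y):
    """The candidate inner product <x,y> from problem (7)."""
    n = len(x)
    s = sum(2 * x[i] * y[i] for i in range(n))
    s += sum(x[i] * y[i + 1] for i in range(n - 1))
    s += sum(y[i] * x[i + 1] for i in range(n - 1))
    return s

x = [1.0, -2.0, 3.0]   # arbitrary sample vectors
y = [0.5, 4.0, -1.0]

print(ip(x, y) == ip(y, x))   # symmetry: <x,y> = <y,x>
print(ip(x, x) > 0)           # positivity on a nonzero vector
# linearity in the first slot: <x + y, y> = <x,y> + <y,y>
z = [x[i] + y[i] for i in range(3)]
print(abs(ip(z, y) - (ip(x, y) + ip(y, y))) < 1e-12)
```

A check on sample vectors only rules out mistakes; the actual proof must verify the axioms for all x and y, with the sum-of-squares rewriting doing the work for positivity.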