lambda2 = min{v != 0, v'*v1 = 0} v'*A*v / v'*v, and the minimizing v is v2.
Proof. Substitute the eigendecomposition A = Q*Lambda*Q', where Q is an orthogonal matrix whose columns v(i) are the eigenvectors, and Lambda = diag(lambda1, ..., lambdan) is the diagonal matrix of eigenvalues, into the expression in the theorem:
                                   v'*A*v
       min{v!=0, v'*v1 = 0}        ------
                                    v'*v

                                   v'*Q*Lambda*Q'*v
     = min{v!=0, v'*v1 = 0}        ----------------
                                         v'*v

                                        v'*Q*Lambda*Q'*v
     = min{Q'*v!=0, v'*Q*Q'*v1 = 0}     ----------------
                                           v'*Q*Q'*v

                                   y'*Lambda*y
     = min{y!=0, y'*y1 = 0}        -----------
                                       y'*y

           where y = Q'*v and y1 = Q'*v1 = [1,0,...,0]'

                                   sumi lambdai*y(i)^2
     = min{y!=0, y(1) = 0}         -------------------
                                      sumi y(i)^2

Since y(1) = 0, this last quotient is a weighted average of lambda2, ..., lambdan, so it is at least lambda2; it is minimized by taking y = [0,1,0,...,0]', yielding lambda2. Then v = Q*y = v2, as desired. QED
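A quick numerical sanity check of the lemma (a minimal sketch, assuming NumPy; the random matrix A, the basis B, and all variable names are illustrative, not from the text): it restricts the Rayleigh quotient to the subspace {v : v'*v1 = 0} by writing v = B*z for an orthonormal basis B of that subspace, and checks that the minimum of the restricted quotient, namely the smallest eigenvalue of B'*A*B, agrees with lambda2 from the eigendecomposition.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2                      # arbitrary symmetric test matrix

    lam, Q = np.linalg.eigh(A)             # eigenvalues sorted: lam[0] <= ... <= lam[n-1]
    v1 = Q[:, 0]                           # eigenvector for lambda1 = lam[0]

    # Orthonormal basis B for { v : v'*v1 = 0 }: complete v1 to an orthonormal
    # basis via QR and drop the first column (which is +/- v1).
    W = np.column_stack([v1, rng.standard_normal((n, n - 1))])
    B = np.linalg.qr(W)[0][:, 1:]

    # For v = B*z we have v'*v1 = 0 and v'*v = z'*z, so the Rayleigh quotient
    # v'*A*v / v'*v equals z'*(B'*A*B)*z / z'*z, whose minimum over z != 0
    # is the smallest eigenvalue of B'*A*B.
    restricted_min = np.linalg.eigvalsh(B.T @ A @ B)[0]

    print(lam[1], restricted_min)          # both print lambda2, equal up to rounding

Writing v = B*z plays the same role as the change of variables y = Q'*v in the proof: it eliminates the orthogonality constraint and leaves an unconstrained Rayleigh quotient for a smaller symmetric matrix.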