Eigenvectors of a symmetric matrix
If \(A\in R^{n\times n}\) is a nonzero symmetric matrix, let \(\lambda=\sup_{y:\|y\|=1} \|Ay\|\); note \(\lambda > 0\) since \(A \neq 0\). By compactness of the unit sphere, there is an \(x\in R^n\) with \(\|x\|=1\) such that \(\|Ax\| = \lambda\). Then \[\begin{align} \|A(Ax + \lambda x) - \lambda (Ax + \lambda x)\|^2 &= \|A^2x - \lambda^2 x\|^2 \\&= \|A^2x\|^2 - 2\lambda^2\|Ax\|^2 + \lambda^4 \\&= \|A^2x\|^2 - \lambda^4 \\&\le 0, \end{align}\] where the second equality uses the symmetry of \(A\) (so that \(\langle A^2x, x\rangle = \|Ax\|^2 = \lambda^2\)) and the inequality uses \(\|A^2x\| \le \lambda\|Ax\| = \lambda^2\). Hence \(A(Ax+\lambda x) = \lambda(Ax+\lambda x)\). Either \(Ax+\lambda x \neq 0\) and it is an eigenvector of \(A\) with eigenvalue \(\lambda\), or \(Ax = -\lambda x\) and \(x\) itself is an eigenvector with eigenvalue \(-\lambda\). In both cases \(A\) has an eigenvector with a nonzero eigenvalue.
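The construction can be checked numerically. Below is a minimal sketch (assuming NumPy); power iteration on \(A^2\) is used purely as a numerical stand-in for the compactness argument to find a unit vector \(x\) that approximately maximizes \(\|Ax\|\), and all variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                       # a random symmetric test matrix

# Approximate a maximizer of ||Ax|| over the unit sphere via power
# iteration on A^2 (a numerical stand-in for the compactness argument).
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(1000):
    x = A @ (A @ x)
    x /= np.linalg.norm(x)

lam = np.linalg.norm(A @ x)             # lambda = ||Ax|| at the (approximate) maximizer

v = A @ x + lam * x
if np.linalg.norm(v) > 1e-8:
    # Generic case: Ax + lambda*x is an eigenvector with eigenvalue lambda.
    print("residual:", np.linalg.norm(A @ v - lam * v))   # ~ 0
else:
    # Degenerate case: Ax = -lambda*x, so x itself is an eigenvector with eigenvalue -lambda.
    print("residual:", np.linalg.norm(A @ x + lam * x))   # ~ 0
```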
To get the full spectral theorem, proceed by induction on the rank in the usual way (the base case, a rank-\(0\) matrix, is trivial): assume that every symmetric matrix \(B\) of rank \(r-1\) admits a family of orthonormal eigenvectors \(u_1,\dots,u_{r-1}\) with eigenvalues \(\lambda_1,\dots,\lambda_{r-1}\) such that \(B = \sum_{i=1}^{r-1} \lambda_i u_i u_i^T\) and such that each \(u_i\) is orthogonal to \(\ker B\).
Now if \(A\) is symmetric of rank \(r \ge 1\), let \(u_r\) be a unit eigenvector of \(A\) with nonzero eigenvalue \(\lambda_r\), as given by the first paragraph, and let \(B = A - \lambda_r u_r u_r^T\). Then \(B\) is symmetric and \(\ker B = \ker A \oplus \mathrm{span}(u_r)\) (one checks this using \(u_r \perp \ker A\), which holds because \(\lambda_r \neq 0\) and \(A\) is symmetric), so \(B\) has rank \(r-1\). Applying the induction hypothesis to \(B\) gives orthonormal eigenvectors \(u_1,\dots,u_{r-1}\) of \(B\), each orthogonal to \(\ker B\) and hence to \(u_r\), with \(B = \sum_{i=1}^{r-1} \lambda_i u_i u_i^T\). Since \(u_i \perp u_r\), we get \(A u_i = B u_i = \lambda_i u_i\), and \(A = B + \lambda_r u_r u_r^T = \sum_{i=1}^{r} \lambda_i u_i u_i^T\), which completes the induction.
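The inductive step translates directly into a deflation procedure: extract one eigenpair by the argument of the first paragraph, subtract \(\lambda_r u_r u_r^T\), and repeat until the deflated matrix is zero (the rank-\(0\) base case). Here is a self-contained sketch, again assuming NumPy; the helper names `top_eigenpair` and `spectral_decomposition` are my own, and power iteration again stands in for the compactness argument.

```python
import numpy as np

def top_eigenpair(B, iters=2000, rng=np.random.default_rng(1)):
    """Eigenpair (lam, u) of a nonzero symmetric B, constructed as in the first paragraph."""
    x = rng.standard_normal(B.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):              # power iteration on B^2: numerical stand-in
        x = B @ (B @ x)                 # for the compactness argument
        x /= np.linalg.norm(x)
    lam = np.linalg.norm(B @ x)         # lambda = ||Bx|| at the (approximate) maximizer
    v = B @ x + lam * x
    if np.linalg.norm(v) > 1e-8:
        return lam, v / np.linalg.norm(v)   # generic case: eigenvalue +lambda
    return -lam, x                          # degenerate case: Bx = -lambda x

def spectral_decomposition(A, tol=1e-9):
    """Mirror the induction: peel off one eigenpair at a time from the deflated matrix."""
    B = A.copy()
    pairs = []
    for _ in range(A.shape[0]):         # at most n eigenpairs
        if np.linalg.norm(B) <= tol:    # rank-0 base case: B = 0
            break
        lam, u = top_eigenpair(B)
        pairs.append((lam, u))
        B = B - lam * np.outer(u, u)    # deflation: rank drops by one
    return pairs

# Sanity check: A is recovered as the sum of lam_i u_i u_i^T.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2
pairs = spectral_decomposition(A)
A_rec = sum(lam * np.outer(u, u) for lam, u in pairs)
print("reconstruction error:", np.linalg.norm(A - A_rec))   # ~ 0
```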