How are eigenvalues and eigenvectors calculated from a covariance matrix?



In the above image S is a covariance matrix. The eigen(S) function calculates some values and vectors as shown. I am trying to understand how these are calculated. Can somebody please enlighten me on this?


What is done:

The transformation matrix A preserves the direction of the vectors shown in blue. The vectors in red are not parallel to either of the blue vectors, so their directions are changed by the transformation. Here, the blue vectors are the eigenvectors we aim to find.
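A small sketch of this idea, using a hypothetical 2x2 matrix A (the matrix from the image is not shown in the question, so this one is made up for illustration): applying A to an eigenvector only scales it, while a non-eigenvector comes out pointing in a different direction.

```python
import numpy as np

# Hypothetical symmetric 2x2 matrix, standing in for the transformation A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v_blue = np.array([1.0, 1.0])   # an eigenvector of A: direction preserved
v_red  = np.array([1.0, 0.0])   # not an eigenvector: direction changes

print(A @ v_blue)  # [3. 3.] -- still along [1, 1], just scaled by 3
print(A @ v_red)   # [2. 1.] -- no longer along [1, 0]
```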

How it’s done:

For any square matrix A, if the direction of a nonzero vector x remains unchanged under the linear transformation A (i.e. x is only scaled, by some factor k), then:

A * x = k * x

Thus, (A - k * I) * x = 0, where I is the identity matrix.

Since x is nonzero, the matrix (A - k * I) must be singular. Therefore:


  1. det(A - k * I) = 0 gives the values of k, i.e. the eigenvalues.
  2. Each value of k can then be back-substituted into (A - k * I) * x = 0 to obtain the corresponding x, which is the eigenvector.
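The two steps above can be carried out by hand for a 2x2 case. A minimal sketch, assuming a hypothetical covariance matrix S (the actual matrix from the image is not shown): solve det(S - k*I) = 0 as a quadratic, then back-substitute each root to read off an eigenvector, and cross-check against NumPy's built-in solver.

```python
import numpy as np

# Hypothetical 2x2 covariance matrix (stand-in for the S in the image).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Step 1: det(S - k*I) = 0. For a 2x2 matrix [[a, b], [c, d]] this expands
# to the characteristic polynomial k^2 - (a + d)*k + (a*d - b*c) = 0.
trace = S[0, 0] + S[1, 1]
det = S[0, 0] * S[1, 1] - S[0, 1] * S[1, 0]
eigenvalues = np.roots([1.0, -trace, det])   # roots of the quadratic

# Step 2: back-substitute each k into (S - k*I) x = 0 and solve for x.
# Here we read x off the first row of M = S - k*I (this shortcut assumes
# that row is nonzero, which holds for this example).
for k in eigenvalues:
    M = S - k * np.eye(2)
    x = np.array([-M[0, 1], M[0, 0]])        # satisfies M[0] . x = 0
    x = x / np.linalg.norm(x)                # normalise to unit length
    print(k, x, S @ x - k * x)               # residual should be ~0

# Cross-check with NumPy (eigh is the solver for symmetric matrices,
# which covariance matrices always are).
print(np.linalg.eigh(S))
```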

Why it’s done:

The above covers the calculation of eigenvalues and eigenvectors. To understand what an eigenvector means for the data, refer to the below: