How to interpret the eigenvectors when doing PCA

eigen_vectors
r
pca
data_wrangling

#1

Hello,

While doing PCA, I computed the eigenvectors of the covariance matrix in R:

These are the first two columns of the result. How do I interpret them, and what exactly do they represent?
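For concreteness, the computation was along these lines (using R's built-in `mtcars` data here as a stand-in for my data):

```r
# Center the variables, then take the eigen decomposition of the
# covariance matrix. mtcars is only a stand-in data set.
X <- scale(mtcars, center = TRUE, scale = FALSE)
S <- cov(X)          # covariance matrix of the centered data
e <- eigen(S)        # eigenvalues and eigenvectors
e$vectors[, 1:2]     # the first two eigenvectors (one per column)
```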


#2

Let us first see what the vectors represent:
The first column is the direction along which the data has the maximum variance; it is called the first principal component. The second column is the direction that is orthogonal to the first and explains the maximum possible remaining variance, i.e. the second principal component.
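To make this concrete, here is a sketch using R's built-in `mtcars` data. It shows that the eigenvectors of the covariance matrix are the same loadings that `prcomp()` reports, and that projecting the data onto the first eigenvector yields the largest variance achievable in any single direction:

```r
# Eigen decomposition of the covariance matrix vs. prcomp().
e <- eigen(cov(mtcars))
p <- prcomp(mtcars, center = TRUE, scale. = FALSE)

# The first two eigenvectors match prcomp's rotation matrix, possibly with
# flipped signs (a sign flip does not change the direction):
abs(e$vectors[, 1:2]) - abs(p$rotation[, 1:2])  # ~ all zeros

# Projecting the centered data onto the eigenvectors gives the component
# scores; the variance of the first score column is the first eigenvalue.
scores <- scale(as.matrix(mtcars), center = TRUE, scale = FALSE) %*% e$vectors
var(scores[, 1])   # equals e$values[1]
```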

If that is hard to follow, the following background on PCA should help:
Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables.

The transformation is defined in such a way that the first principal component has the largest possible variance (that is, accounts for as much of the variability in the data as possible), and each succeeding component in turn has the highest variance possible under the constraint that it is orthogonal to the preceding components. The resulting vectors form an uncorrelated orthogonal basis set; they are orthogonal because they are the eigenvectors of the covariance matrix, which is symmetric.

Finally, note that PCA is sensitive to the relative scaling of the original variables.
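The last point, sensitivity to scaling, is easy to demonstrate with the built-in `mtcars` data, where `disp` has by far the largest variance of the variables:

```r
# PCA on mtcars without and with standardization of the variables.
p_raw    <- prcomp(mtcars, scale. = FALSE)  # covariance-based PCA
p_scaled <- prcomp(mtcars, scale. = TRUE)   # correlation-based PCA

# Unscaled, the high-variance variable disp dominates the first loading vector:
round(p_raw$rotation[, 1], 2)
# Scaled, the loadings are spread far more evenly across the variables:
round(p_scaled$rotation[, 1], 2)
```

This is why it is common to standardize the variables (or equivalently, use the correlation matrix) before PCA when they are measured on very different scales.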

Feel free to ask a follow-up question if anything is still unclear.