Why is the sum of the squared components of each PCA equal to 1?

pca

#1
X <- cbind(Ag, Mining, Constr, Manuf, Manuf_nd, Transp, Comm,
           Energy, TradeW, TradeR, RE, Services, Govt)

#Descriptive statistics
summary(X)

# Principal component analysis should not be applied if the data is not highly correlated.
pca1 <- princomp(X, scores=TRUE, cor=T)
pca1_rot <- prcomp(X,center = T,scale. = T)
summary(pca1)
# Loadings of principal components
loadings(pca1)
#pca1$loadings:
sum(pca1$loadings[2,]^2); sum(pca1$loadings[1,]^2)

The above generates:

sum(pca1$loadings[2,]^2)
[1] 1
sum(pca1$loadings[1,]^2)
[1] 1
The principal components try to capture the maximum variance across the variables, and if we have standardized the data then each variable's variance is 1. Is this somehow related to the fact that the sum of the squared coefficients of each principal component comes out to be 1?


#2

@data_hacks-

PCA is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of M possibly correlated variables into a set of K uncorrelated variables called principal components.

The number of principal components is always less than or equal to the number of original variables, i.e. K is less than or equal to M.

PCA can be done by eigenvalue decomposition of the data covariance matrix (or of the correlation matrix, when the variables are standardized, as with cor=TRUE above).
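As a quick sketch of that equivalence (using the built-in USArrests data as a stand-in for your X, which isn't reproducible here), the loadings from princomp are exactly the eigenvectors of the correlation matrix, up to sign:

```r
# Hypothetical stand-in data; your own X would work the same way
X  <- as.matrix(USArrests)
R  <- cor(X)                 # correlation matrix of the standardized variables
ev <- eigen(R)               # eigenvectors = loadings, eigenvalues = component variances

pca <- princomp(X, cor = TRUE)
L   <- unclass(loadings(pca))

# Agreement up to sign (eigenvector signs are arbitrary)
max(abs(abs(ev$vectors) - abs(L)))   # essentially zero
```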

Loadings are the weights by which each standardized original variable is multiplied to get the component score. Each loading vector is an eigenvector of the correlation matrix, and eigenvectors are by convention normalized to unit length, so the squared loadings of each component always sum to 1. Because the full loading matrix is orthogonal, its rows have unit length too, which is exactly what your sums over pca1$loadings[1,] and pca1$loadings[2,] show.
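You can check both facts directly. A minimal sketch, again using USArrests as a hypothetical stand-in for your data:

```r
# Hypothetical stand-in data; any numeric matrix behaves the same
X   <- as.matrix(USArrests)
pca <- princomp(X, cor = TRUE)
L   <- unclass(loadings(pca))

colSums(L^2)  # each column sums to 1: eigenvectors are unit-length
rowSums(L^2)  # each row also sums to 1: the loading matrix is orthogonal
```

So the result has nothing to do with the variances of the standardized variables being 1; it is purely the unit-length normalization of the eigenvectors, plus the orthogonality of the loading matrix for the row sums.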

Hope this helps!

Regards,
Hinduja