Statistics - Why squaring is used to calculate error in RMSE

linear_regression
statistics
rms

#1

In linear regression, the root mean square error (RMSE) is calculated by:

  1. Taking the distance of each point from the line
  2. Squaring those values (so they become non-negative)
  3. Taking the mean
  4. Taking the square root

Why are absolute values not taken (which also yield non-negative values) instead of squaring the values and then taking the square root?
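The four steps above can be sketched like this (the line `y = 2x + 1` and the sample points are made up for illustration):

```python
import math

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.2, 8.8]

def rmse(xs, ys, slope, intercept):
    # 1. distance of each point from the line (the residual)
    residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    # 2. square the residuals (all become non-negative)
    squared = [r * r for r in residuals]
    # 3. take the mean
    mean_sq = sum(squared) / len(squared)
    # 4. take the square root
    return math.sqrt(mean_sq)

print(rmse(xs, ys, 2.0, 1.0))
```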


#2

Hi @sonicboom8,

Based on my knowledge, RMSE is more popular than Mean Absolute error because:

  1. The RMSE penalises large errors more heavily than MAE does. In particular, this makes the fitted regression line more sensitive to outliers.

  2. The MAE is not differentiable at zero, whereas the squared error is differentiable everywhere, which makes it convenient for gradient descent.
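A quick way to see point 1 is to compare two error vectors with the same total absolute error, one spread evenly and one concentrated in a single large error (the numbers are made up for illustration):

```python
import math

def rmse(errors):
    # root of the mean of the squared errors
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def mae(errors):
    # mean of the absolute errors
    return sum(abs(e) for e in errors) / len(errors)

even    = [1.0, 1.0, 1.0, 1.0]   # errors spread evenly
outlier = [0.0, 0.0, 0.0, 4.0]   # same total error, one big miss

print(mae(even), rmse(even))        # MAE 1.0, RMSE 1.0
print(mae(outlier), rmse(outlier))  # MAE 1.0, RMSE 2.0
```

Both vectors have the same MAE, but the one with the single large error has double the RMSE, because squaring weights the 4.0 miss much more heavily.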


#3

Thanks sauravkaushik8


#4

@sonicboom8
Check this link out! It explains it well!
https://www.quora.com/Why-do-we-square-instead-of-using-the-absolute-value-when-calculating-variance-and-standard-deviation