What is the difference between Ridge and Lasso regression?

regularization
machine_learning
ridge_regression
regression

#1

Hello,

I know that we apply ridge and lasso regression for the purpose of regularization, but I don’t understand the difference between the two. Under what conditions should each of them be applied?
How do they differ in relation to overfitting?

Thanks


#2

@mukesh,

These two techniques are advanced regression methods. Both work well when a large number of parameters has to be estimated, and both help address the overfitting problem.

Ridge Regression: It is used to address multicollinearity in OLS regression models by incorporating a shrinkage parameter (which is central to ridge regression). The assumptions are the same as for the OLS model, such as linearity, constant variance and independence; normality does not need to be assumed.

Lasso Regression: It is similar to Ridge Regression but performs automatic variable selection. It allows regression coefficients to shrink to exactly zero, whereas Ridge does not.
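A minimal sketch, assuming scikit-learn is available (the synthetic dataset and the alpha value are purely illustrative), showing how the shrinkage parameter alpha enters both estimators:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

# Illustrative synthetic data: 100 samples, 10 features, only 3 informative.
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

# alpha is the shrinkage (regularization) parameter in both models.
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

print("Ridge coefficients:", np.round(ridge.coef_, 2))
print("Lasso coefficients:", np.round(lasso.coef_, 2))
```

A larger alpha means stronger shrinkage in both cases; the two models differ only in the penalty they apply to the coefficients (squared values for ridge, absolute values for lasso).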

Regards,
Mark


#3

Hi

I understand the difference between ridge and lasso, but how should I decide which technique to apply where?


#4

Hi @AishwaryaSingh,

As mentioned by @Mark, Ridge Regression reduces model complexity through coefficient shrinkage, i.e. the magnitudes of the coefficients decrease and move towards zero, but they never reach exactly zero.

In the case of Lasso Regression, some coefficients are reduced to exactly zero. Lasso therefore keeps only some features and shrinks the coefficients of the others all the way to zero. This property is known as feature selection, and it is absent in ridge.
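Here is a minimal sketch of that difference, assuming scikit-learn (the synthetic data and the alpha value are just for illustration): lasso typically zeroes out several coefficients, while ridge leaves them small but non-zero.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

# Illustrative data: 20 features, only 5 of which are truly informative.
X, y = make_regression(n_samples=100, n_features=20, n_informative=5,
                       noise=15.0, random_state=42)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

# Ridge almost never produces exact zeros; lasso usually does.
print("Ridge coefficients exactly zero:", np.sum(ridge.coef_ == 0))
print("Lasso coefficients exactly zero:", np.sum(lasso.coef_ == 0))
```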

Lasso Regression is generally used when we have a large number of features, because it performs feature selection automatically. If we have fewer features, or we don’t want to lose any feature, we can use Ridge Regression instead. If you are unsure, you can also let cross-validation decide, as in the sketch below.
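A rough way to choose on your own data, assuming scikit-learn (the dataset, alpha grid and CV settings here are illustrative): tune each model’s alpha with its built-in cross-validation and compare the cross-validated scores.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import RidgeCV, LassoCV
from sklearn.model_selection import cross_val_score

# Illustrative data with many features, few of them informative.
X, y = make_regression(n_samples=200, n_features=50, n_informative=10,
                       noise=20.0, random_state=0)

# Each estimator picks its own alpha internally via cross-validation.
ridge = RidgeCV(alphas=np.logspace(-3, 3, 13))
lasso = LassoCV(cv=5, random_state=0)

ridge_score = cross_val_score(ridge, X, y, cv=5).mean()
lasso_score = cross_val_score(lasso, X, y, cv=5).mean()

print(f"Ridge mean CV R^2: {ridge_score:.3f}")
print(f"Lasso mean CV R^2: {lasso_score:.3f}")
```

Whichever model scores better on the held-out folds is usually the safer choice for that particular dataset.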