I am aware that:
K-fold cross-validation is a technique where we split the dataset into K folds; each fold is held out in turn for testing/validation while the model is trained on the remaining folds.
Grid search: this is used to tune our hyperparameters by evaluating every combination in a parameter grid and selecting the best set.
The question is how grid search and cross-validation are interlinked. Is it necessary to use both of them simultaneously, or can I use them independently?
Please explain how the line below is evaluated (note: `param_grid` must be passed by keyword or as the second positional argument, `LeaveOneOut()` takes no constructor arguments, and the scorer name in current scikit-learn is `'neg_mean_squared_error'`):
grid = GridSearchCV(estimator=lasso_clf, param_grid=model_grid, cv=LeaveOneOut(), scoring='neg_mean_squared_error')
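To make the question concrete, here is a minimal runnable sketch of that call. It assumes scikit-learn; `lasso_clf`, `model_grid`, and the training data are placeholders from my question, filled in here with toy stand-ins. With leave-one-out CV, every candidate parameter set is refit once per sample, so the number of fits is (number of candidates) × (number of samples).

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV, LeaveOneOut

# Toy data standing in for my real training set (20 samples, 3 features)
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

lasso_clf = Lasso(max_iter=10_000)
model_grid = {"alpha": [0.01, 0.1, 1.0]}  # hypothetical grid for illustration

# LeaveOneOut() takes no arguments; it infers the number of splits from the
# data passed to fit(). Here: 3 alpha candidates x 20 folds = 60 fits.
grid = GridSearchCV(
    estimator=lasso_clf,
    param_grid=model_grid,
    cv=LeaveOneOut(),
    scoring="neg_mean_squared_error",  # sklearn maximizes, hence the negation
)
grid.fit(X, y)
print(grid.best_params_)
```

So grid search and cross-validation are used together inside `GridSearchCV`: the CV splitter scores each parameter combination, and the best one is refit on the full data. You can still use either independently (e.g. `cross_val_score` for plain CV, or a manual loop over parameters).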
Thanks in advance,