Ridge regression using glmnet in R




I am fairly new to this and am learning about ridge and lasso regression.
Do we need to pass the data as matrices to the glmnet() function when performing ridge regression?
So in the call glmnet(x, y, family = "gaussian", alpha = 0, lambda = 0.001), do both x and y need to be matrices, and which of x and y is the predictor variable?
What do the additional parameters we specify do?



This is one way to use it:
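A minimal sketch, assuming a data frame `df` whose response column is `y` (the object names here are illustrative, not from the original post):

```r
library(glmnet)

# Build a numeric predictor matrix from a data frame;
# model.matrix() adds an intercept column, which we drop.
x <- model.matrix(y ~ ., data = df)[, -1]
y <- df$y

# Cross-validated ridge fit (alpha = 0 selects ridge regression).
cv_fit <- cv.glmnet(x, y, family = "gaussian", alpha = 0)

# Predict at the lambda chosen by cross-validation.
# new.x must be a matrix with the same columns as x.
pred <- predict(cv_fit, newx = new.x, s = "lambda.min")
```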





An explanation of the parameters is as follows:

- x is a matrix of predictors. model.matrix() turns a data frame into such a matrix.
- y is a vector (the response variable).
- alpha: set this to 0 for ridge regression and 1 for the lasso. You can also set it to a value between 0 and 1 (the elastic net).
- s: you can set this to 'lambda.min' or 'lambda.1se' when predicting. If you set it to 'lambda.min', the regularization parameter is the value with the minimum cross-validation error (the cross-validation is done automatically because the model comes from cv.glmnet). If you set it to 'lambda.1se', it is the largest lambda whose cross-validation error is within one standard error of that minimum.
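For example, both candidate values can be inspected directly on a cv.glmnet fit (a small synthetic data set is used here just to produce a fit object):

```r
library(glmnet)
set.seed(1)

# Synthetic data purely for illustration.
x <- matrix(rnorm(100 * 5), nrow = 100)
y <- rnorm(100)

cv_fit <- cv.glmnet(x, y, alpha = 0)

cv_fit$lambda.min  # lambda with the smallest mean cross-validation error
cv_fit$lambda.1se  # largest lambda within one standard error of that minimum
```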


I used the code in the same way that you provided.

After this command,


I am getting an error saying

Error in as.matrix(cbind2(1, newx) %*% nbeta) :
error in evaluating the argument 'x' in selecting a method for function 'as.matrix': Error in t(.Call(Csparse_dense_crossprod, y, t(x))) :
error in evaluating the argument 'x' in selecting a method for function 't': Error: Cholmod error 'X and/or Y have wrong dimensions' at file …/MatrixOps/cholmod_sdmult.c, line 90

What is causing this and how do I fix it?


I'm not sure. Perhaps you should examine x and new.x to see whether the matrices are being created as expected.
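Since the Cholmod message complains about dimensions, a first step might be to check the shapes and types of the two objects (x and new.x as named above):

```r
dim(x)      # rows = observations, columns = predictors used for fitting
dim(new.x)  # must have the same number of columns as x

# predict() expects newx to be a matrix, not a data frame;
# coercing it is a common fix when the column counts already match.
class(new.x)
new.x <- as.matrix(new.x)
```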