How cross validation helps in SVM classification

svm
r
crossvalidation

#1

I am currently studying SVM classification, and while using it I came across a parameter, cost, which needs to be specified when creating the model. I have chosen an arbitrary cost value. I read that a good value can be found by cross validation, but I am not able to perform it.

set.seed(1)
x<-matrix(rnorm(20*2),ncol=2)
y<-c(rep(-1,10),rep(1,10))
x[y==1,]=x[y==1,]+1
plot(x,col=(3-y))
dat<-data.frame(x=x,y=as.factor(y))
require(e1071)
svmfit=svm(y~.,data=dat,kernel="linear",cost=10,scale=FALSE)

Here, I have created the model with a cost value of 10. I want to check with cross validation whether this is the best value or not.


#2

Hi @hinduja1234,

Is it implying that you can arrive at the best cost value by evaluating a metric such as RMSE or accuracy on the cross validation data set? Say your evaluation metric is RMSE; then whichever cost value gives the lowest RMSE on the cross validation dataset will be the best cost value.
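As a rough sketch of that idea (assuming the dat data frame from the first post, and using classification accuracy rather than RMSE since this is a classification problem), you could compare the cross-validation accuracy reported by svm() for a few candidate cost values:

library(e1071)
# Candidate cost values to compare by 5-fold cross validation
costs <- c(0.001, 0.01, 0.1, 1, 5, 10, 100)
cv_acc <- sapply(costs, function(cst) {
  fit <- svm(y ~ ., data = dat, kernel = "linear", cost = cst, scale = FALSE, cross = 5)
  fit$tot.accuracy   # overall cross-validation accuracy (in percent)
})
data.frame(cost = costs, cv_accuracy = cv_acc)

The cost value with the highest cross-validation accuracy would be the one to keep.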

Let me know if it fits the context.

Regards,
Aayush


#3

You can actually try out different ranges for cost and gamma and ask tune() to report the best model:

tune.out=tune(svm,y~.,data=dat,kernel="radial",ranges=list(cost=c(0.001,0.01,0.1,1,5,10,100),gamma=c(1,2,3,0.5)))
summary(tune.out)
bestmod=tune.out$best.model
summary(bestmod)

This shows you the best model over the different values of cost and gamma for the radial kernel.
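Since the first post uses a linear kernel and only needs cost tuned, a narrower call with the same tune() interface (assuming the dat data frame from that post) would look something like this:

# 10-fold cross validation (tune's default) over cost only, linear kernel
tune.lin <- tune(svm, y~., data=dat, kernel="linear",
                 ranges=list(cost=c(0.001,0.01,0.1,1,5,10,100)))
tune.lin$best.parameters   # cost value with the lowest cross-validation error
summary(tune.lin)          # error for every cost value tried

tune.lin$best.model then gives you the SVM refit with that cost, ready to use with predict().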