My training set has 20 variables and 1M observations.
I tried random forest in R on this data set and got the following error and warning messages:
"Error: cannot allocate vector of size 6.8 Gb
In addition: Warning messages:
1: In array(rfout$treemap, dim = c(2, nrnodes, ntree)) :
Reached total allocation of 32722Mb: see help(memory.size)"
There is a bigrf package for larger datasets, but I couldn't install it in a Windows environment.
Can anybody who has faced a similar issue help me with this?
Should I switch to SVM or some other algorithm instead? If so, please explain how to set its parameters.
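For context, one way to reduce randomForest's memory footprint before switching algorithms is to subsample rows per tree and cap tree size. The sketch below uses real randomForest arguments (ntree, sampsize, nodesize, maxnodes), but the data-frame and column names are hypothetical placeholders, and the specific values are illustrative, not tuned:

```r
# A sketch, not a definitive fix: train each tree on a 100k-row subsample
# and limit tree size so the forest fits in memory.
library(randomForest)

fit <- randomForest(
  x = train[, predictor_cols],  # 20 predictors (hypothetical names)
  y = train$target,             # hypothetical response column
  ntree    = 100,               # fewer trees than the default 500
  sampsize = 100000,            # rows sampled per tree instead of all 1M
  nodesize = 50,                # larger terminal nodes => shallower trees
  maxnodes = 4096               # hard cap on nodes per tree
)
```

Another option worth considering is the ranger package, a faster and more memory-efficient random forest implementation that installs on Windows.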