Classification Problem with 1M rows in training set



My data has 20 variables and 1M observations in the training set.
I tried random forest in R on the data set and got the following error and warning message:
"Error: cannot allocate vector of size 6.8 Gb
In addition: Warning messages:
1: In array(rfout$treemap, dim = c(2, nrnodes, ntree)) :
Reached total allocation of 32722Mb: see help(memory.size)"

There is a package, bigrf, for larger datasets, but I couldn't install it in a Windows environment.

Can anybody who has faced a similar issue help me with this?
Should I go with SVM or some other algorithm instead? If so, please explain how to set its parameters.



One quick solution: take a random sample of the data set and build the model on that. Another option is to move to a Unix environment, which has better memory management than Windows.
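As a rough sketch of the subsampling approach (assuming your data is in a data frame `train` with a factor response column `y`; the column name and sample sizes here are just placeholders), you can also lower `ntree` and raise `nodesize` to shrink the forest's memory footprint:

```r
library(randomForest)

set.seed(42)
idx <- sample(nrow(train), 100000)   # random 100k-row subsample
sub <- train[idx, ]

# Fewer, smaller trees use far less memory than the defaults
fit <- randomForest(y ~ ., data = sub,
                    ntree = 100,      # default is 500
                    nodesize = 50,    # larger terminal nodes -> shallower trees
                    sampsize = 50000) # rows drawn per tree

# Check out-of-bag error, then score the held-out rows
print(fit)
pred <- predict(fit, newdata = train[-idx, ])
```

If the out-of-bag error on the subsample is acceptable, you may not need the full 1M rows at all; you can also compare models built on a few different subsamples to check that the estimate is stable.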