I have been using R for some time now and have built various predictive models with it. I recently started competing in Kaggle competitions and found that I cannot process datasets larger than about 3 GB. After some research I learned that R loads data entirely into RAM, so this limit is driven by the amount of RAM on my machine.
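To illustrate what I mean, here is a minimal sketch of how the in-memory limit shows up (the file name "train.csv" is just a placeholder for a competition dataset):

```r
# read.csv pulls the entire file into RAM as a data frame
train <- read.csv("train.csv")

# in-memory size of the object -- roughly the file size or larger
print(object.size(train), units = "auto")

# report memory in use after garbage collection
gc()
```

Once the object size approaches the physical RAM on my machine, loading simply fails or the session becomes unusable.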
Is there any way to overcome this limitation, and if so, how? Some Kaggle competitions have had datasets larger than 20 GB, and I have seen people use R to solve those problems.
Any help would be greatly appreciated!