This is with reference to the post "Learn Gradient Boosting Algorithm for better predictions (with codes in R)".
I am trying to run the code from the blog, but I am not able to because of a memory limitation.
I am running it on a 64-bit machine with 8 GB of RAM and 59 GB of free space on the C drive.
library(caret)
library(Metrics)

setwd("C:/Users/mcvi/Desktop/Modelling/Analytics Vidya/Datascience 3x/Raw data")
complete <- read.csv("train.csv", stringsAsFactors = TRUE)
train <- complete[complete$Disbursed == 1,]
score <- complete[complete$Disbursed != 1,]

set.seed(999)
ind <- sample(2, nrow(train), replace = TRUE, prob = c(0.60, 0.40))
trainData <- train[ind == 1,]
testData  <- train[ind == 2,]

set.seed(999)
ind1 <- sample(2, nrow(testData), replace = TRUE, prob = c(0.50, 0.50))
trainData_ens1 <- testData[ind1 == 1,]
testData_ens1  <- testData[ind1 == 2,]
table(testData_ens1$Disbursed) / nrow(testData_ens1)

fitControl <- trainControl(method = "repeatedcv", number = 4, repeats = 4)
trainData$outcome1 <- ifelse(trainData$Disbursed == 1, "Yes", "No")

set.seed(33)
memory.limit(size = 90000)
gbmFit1 <- train(as.factor(outcome1) ~ ., data = trainData[,-26],
                 method = "gbm", trControl = fitControl, verbose = FALSE)
While running the last line (the call to train), I get the error "Error: cannot allocate vector of size 56.4 Gb".
Please let me know if I am doing anything wrong in the code.
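For what it's worth, I suspect the formula interface (outcome1 ~ .) expands every factor column into dummy variables, so a factor with thousands of levels (an ID or city column, say) could easily produce a 56 GB model matrix. Here is a small sketch I used to check factor cardinality; the data frame and column names below are made-up stand-ins, since I don't know which columns in train.csv are the culprits:

```r
# Toy stand-in for the real train.csv (actual column names unknown).
set.seed(1)
df <- data.frame(
  City      = factor(sprintf("city_%04d", sample(1:3000, 100, replace = TRUE))),
  Gender    = factor(sample(c("M", "F"), 100, replace = TRUE)),
  Disbursed = sample(0:1, 100, replace = TRUE)
)

# Count levels per factor column; high-cardinality factors explode the
# model matrix that caret builds for gbm when you use a formula.
factor_cols  <- sapply(df, is.factor)
level_counts <- sapply(df[, factor_cols, drop = FALSE], nlevels)
sort(level_counts, decreasing = TRUE)
```

If a column shows thousands of levels, dropping it (or recoding it) before calling train should shrink the allocation dramatically.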