How to resolve a memory allocation error when fitting a linear regression?

memory_allocation
r

#1

Hi everyone,

I am working on a dataset with 25 variables and around 1,000,000 rows. When I apply linear regression:

    lmao <- lm(Sales ~ ., data = combined)

it gives me this error:

    Error: cannot allocate vector of size 4.1 Gb
    In addition: Warning messages:
    1: In model.matrix.default(mt, mf, contrasts) :
      Reached total allocation of 4027Mb: see help(memory.size)
    2: In model.matrix.default(mt, mf, contrasts) :
      Reached total allocation of 4027Mb: see help(memory.size)
    3: In model.matrix.default(mt, mf, contrasts) :
      Reached total allocation of 4027Mb: see help(memory.size)
    4: In model.matrix.default(mt, mf, contrasts) :
      Reached total allocation of 4027Mb: see help(memory.size)

My configuration is 64-bit Windows 10 with 4 GB of RAM. Please suggest how I can optimise the memory used by the model.

Thanks in advance!


#2

Some of these threads might help:

http://discuss.analyticsvidhya.com/t/what-are-the-ways-to-handle-huge-data-in-r/141

http://discuss.analyticsvidhya.com/t/r-memory-issue-on-allocating-large-size-data-to-a-vector/3147

Since this is linear regression, you could also try SparkR.
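Another option that stays inside base R is the `biglm` package, which fits a linear model incrementally: it only keeps a small cross-product summary in memory rather than the full million-row model matrix that `lm` builds (which is what triggers the "cannot allocate vector" error). A minimal sketch, assuming `biglm` is installed and using a small simulated data frame in place of your `combined` data (column names `Sales`, `x1`, `x2` are illustrative; in practice you would read your file chunk by chunk, e.g. with `read.csv(..., nrows = , skip = )`):

```r
library(biglm)

# Simulated stand-in for the real data set
set.seed(1)
combined <- data.frame(Sales = rnorm(10000),
                       x1 = rnorm(10000),
                       x2 = rnorm(10000))

# Split into chunks of 1000 rows; only one chunk is in the model at a time
chunks <- split(combined, rep(1:10, each = 1000))

# Fit on the first chunk, then fold in the remaining chunks
fit <- biglm(Sales ~ x1 + x2, data = chunks[[1]])
for (i in 2:length(chunks)) {
  fit <- update(fit, chunks[[i]])
}

summary(fit)
```

The coefficients come out the same as a full `lm` fit, but peak memory scales with the chunk size instead of the total row count. One caveat if you have factor variables: every chunk must carry the same factor levels, or `update` will complain.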

Regards,
Kunal