R - Memory Allocation "Error: cannot allocate vector of size 75.1 Mb"
In the course of vectorizing some simulation code, I've run into a memory issue. I'm using 32-bit R version 2.15.0 (via RStudio version 0.96.122) under Windows XP. My machine has 3.46 GB of RAM.
> sessionInfo()
R version 2.15.0 (2012-03-30)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252
[3] LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
[5] LC_TIME=English_United Kingdom.1252

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] Matrix_1.0-6   lattice_0.20-6 MASS_7.3-18

loaded via a namespace (and not attached):
[1] grid_2.15.0  tools_2.15.0
Here is a minimal example of the problem:
> memory.limit(3000)
[1] 3000
> rm(list = ls())
> gc()
          used (Mb) gc trigger  (Mb)  max used   (Mb)
Ncells 1069761 28.6    1710298  45.7   1710298   45.7
Vcells  901466  6.9   21692001 165.5 173386187 1322.9
> n <- 894993
> library(MASS)
> sims <- mvrnorm(n = n, mu = rep(0, 11), Sigma = diag(nrow = 11))
> sims <- mvrnorm(n = n + 1, mu = rep(0, 11), Sigma = diag(nrow = 11))
Error: cannot allocate vector of size 75.1 Mb
(In my application the covariance matrix Sigma is not diagonal, but I get the same error either way.)
I've spent the afternoon reading about memory allocation issues in R (including here, here, and here). From what I've read, my impression is that it's not so much a matter of the available RAM per se, but of the available contiguous address space. Still, 75.1 Mb seems pretty small to me.
I'd appreciate any thoughts or suggestions you might have.
R has gotten to the point where the OS cannot allocate it another 75.1 Mb chunk of RAM. That is the size of the memory chunk required to do the next sub-operation; it is not a statement about the total amount of contiguous RAM required to complete the entire process. By this point, all the RAM available to R has been exhausted, you need more memory to continue, and the OS is unable to make any more RAM available to R.
Potential solutions are manifold. The obvious one is to get hold of a 64-bit machine with more RAM. I forget the details, but IIRC on 32-bit Windows a single process can only use a limited amount of RAM (2 GB?), and regardless, Windows keeps a chunk of memory for itself, so the RAM available to R will be somewhat less than the 3.4 GB you have. On 64-bit Windows, R will be able to use more RAM, and the maximum amount of RAM you can fit/install is also increased.
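As a rough check of how close you are to that per-process ceiling, you can ask R itself. A minimal sketch (memory.limit() and memory.size() are Windows-only functions in this era of R, and the last line is just arithmetic showing where the 75.1 Mb figure comes from):

# How much address space R is allowed to use, and the most it has
# claimed from the OS so far (both reported in Mb, Windows only)
memory.limit()
memory.size(max = TRUE)

# The failing allocation is roughly one (n + 1) x 11 matrix of doubles
(894993 + 1) * 11 * 8 / 2^20   # ~75.1 Mb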
If that is not possible, consider an alternative approach: do your simulations in batches, with the n per batch much smaller than the total n. That way you can draw a much smaller number of simulations, do whatever you wanted with them, collect the results, and then repeat the process until you have done sufficient simulations. You don't show what your n is, but I suspect it is big, so try a smaller n a number of times to give you the n you want overall (see the sketch below).
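A minimal sketch of that batching idea, assuming you only need some running summary of the draws (column sums here) rather than the full matrix; the batch size and the summary step are placeholders to adapt:

library(MASS)

n_total    <- 894993          # total number of simulations wanted
batch_size <- 50000           # placeholder batch size; tune so one batch fits in memory
mu         <- rep(0, 11)
Sigma      <- diag(nrow = 11)

col_totals <- numeric(length(mu))   # running summary, never holds all draws
n_done     <- 0

while (n_done < n_total) {
  n_batch <- min(batch_size, n_total - n_done)
  sims <- mvrnorm(n = n_batch, mu = mu, Sigma = Sigma)
  col_totals <- col_totals + colSums(sims)   # collect whatever summary you need
  n_done <- n_done + n_batch
  rm(sims); gc()                             # free the batch before drawing the next
}

col_means <- col_totals / n_total

Because each batch is discarded before the next is drawn, the peak allocation is governed by batch_size rather than by the total number of simulations.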