estimate_memory_posterior.Rd
A call to sample_MegaLMM(MegaLMM_state, n_iter) will run n_iter iterations of the Gibbs sampler. Once the iteration count exceeds burn, a posterior sample of all variables is stored in MegaLMM_state$Posterior every thin iterations. If you are doing a long run and storing a large number of parameters, this will take a lot of memory. This function estimates those memory requirements.
estimate_memory_posterior(MegaLMM_state, n_iter)
MegaLMM_state: The model state, after calling clear_Posterior
n_iter: number of iterations of the Gibbs sampler
Value: The estimated memory size in bytes
Note 1: The estimated value assumes all iterations are post-burnin.
Note 2: sample_MegaLMM() instantiates all arrays that hold the posterior samples before running the iterations, so memory requirements will not increase much during sampling.
Note 3: It is generally not necessary to run sample_MegaLMM(MegaLMM_state, n_iter) with a large n_iter. Instead, run the function many times, each with a small n_iter, calling save_posterior_chunk between runs. This gives you the ability to diagnose problems during the run, and keeps the memory requirements low. You can always reload the posterior samples from the on-disk database using reload_Posterior or load_posterior_param.
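The chunked workflow described in Note 3 can be sketched as follows. This is a minimal sketch, not package-prescribed code: it assumes MegaLMM_state is an already-initialized MegaLMM model, and chunk_size and n_chunks are illustrative values chosen here, not package defaults.

```r
# Sketch of the chunked-sampling workflow from Note 3.
# Assumes MegaLMM_state is an initialized MegaLMM model;
# chunk_size and n_chunks are illustrative values.
chunk_size <- 100
n_chunks   <- 10

# Check the projected memory footprint of one chunk before starting
print(estimate_memory_posterior(MegaLMM_state, chunk_size))

for (i in seq_len(n_chunks)) {
  # Run a small batch of Gibbs iterations
  MegaLMM_state <- sample_MegaLMM(MegaLMM_state, chunk_size)
  # Write this chunk's posterior samples to the on-disk database
  # and free them from memory
  MegaLMM_state <- save_posterior_chunk(MegaLMM_state)
  # (optionally inspect traces here to diagnose problems mid-run)
}

# After the run, reload all saved samples from disk
MegaLMM_state$Posterior <- reload_Posterior(MegaLMM_state)
```

Keeping chunk_size small bounds peak memory at roughly the value returned by estimate_memory_posterior for one chunk, since each chunk's samples are flushed to disk before the next begins.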
# Requires an initialized MegaLMM model object; MegaLMM_state below
# is assumed to be such an object.
estimate_memory_posterior(MegaLMM_state, 100)