Long story short: I'm running a loop that takes random variables and does various statistical things to them.
The loop has to run about 10^7 times, with each iteration's final value going into an Array, but I can't do that because the statistical stuff (such as CumulativeDistributionFunction) takes up so much memory that I end up losing the kernel.
But... today I came across the gc() command. I put it in at the end of my loop (just before the end do), and lo and behold the loop used virtually zero memory, only what was needed to fill the Array.
Problem is, though, calling gc() on every iteration makes the procedure take 5x as long, so I was wondering if there is a way of putting in something like 'if k mod 1000000 = 0 then gc() end if', so that it garbage collects every 1000000 runs as opposed to every time.
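For what it's worth, a conditional call like that is straightforward to write in Maple. Here's a minimal sketch of the idea; the loop body (the Normal distribution, the Sample/CDF calls) is purely illustrative, since the post doesn't say what the actual statistical steps are:

```
# Sketch: garbage-collect only every 10^6 iterations, not every pass.
# The statistical work shown here is a stand-in for whatever the real loop does.
N := 10^7:
results := Array(1 .. N):
for k from 1 to N do
    x := Statistics:-Sample(Normal(0, 1), 1)[1];       # hypothetical random draw
    results[k] := evalf(Statistics:-CDF(Normal(0, 1), x));  # hypothetical statistical step
    if k mod 10^6 = 0 then
        gc();   # collect every 10^6 runs instead of every iteration
    end if;
end do:
```

The `if k mod 10^6 = 0 then gc() end if` guard is exactly the pattern described above: the modulo test is cheap, so the overhead per iteration is negligible, and you can tune the interval (10^6 here) to trade memory headroom against the cost of each collection.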