I am trying to solve a square polynomial system (16 equations in 16 unknowns; an 8 x 8 system solved fine) using the RegularChains package (the problem is the same with Groebner bases), but I have a 4 GB memory limit on the Linux cluster, and my batch job gets terminated after Maple grabs too much memory:
TERM_MEMLIMIT: job killed after reaching LSF memory usage limit.
Exited with: File size limit exceeded.
Resource usage summary:
    CPU time:      4509.00 sec.
    Max Memory:    6 GB
    Max Swap:      47 GB
    Max Processes: 4
    Max Threads:   10
My question is: if I impose memory constraints on Maple (say, by setting datalimit or filelimit to some value), will Maple take more time to perform the computations (which is fine for me, I can wait a month or more), or will the computation simply be terminated by Maple itself instead of by the server? In other words, is there a way of trading memory for time? If this is possible, any references?
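For context, this is how I was planning to set the limit before starting the computation. This is only a sketch based on my reading of the kernelopts help page; I am assuming the limits are given in kibibytes, and I am not sure the gcfreq setting actually trades time for memory:

```maple
# Cap the kernel's data allocation at roughly 3.5 GB, leaving some
# headroom below the cluster's 4 GB limit (assuming kernelopts limits
# are expressed in KiB).
kernelopts(datalimit = 3500000):

# Possibly collect garbage more often than the default, hoping to keep
# the working set smaller at the cost of extra CPU time (assumption on
# my part -- I have not verified this helps).
kernelopts(gcfreq = 500000):
```

My worry is that when datalimit is reached, Maple just raises an error and stops rather than working within the limit.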
I have read several questions on memory problems, but what everyone is asking is how to increase the memory. That option is not available to me right now; on the other hand, the limits on how long I can run processes are relatively lax.