Jeroen Demeyer on Tue, 20 Jan 2015 22:54:15 +0100



Re: Stack size bugs and parisizemax


On 2015-01-20 21:32, Bill Allombert wrote:
> The purpose of parisizemax was to allow a nearly unlimited stack
> (limited only by the available hardware) without increasing the actual
> memory usage (in the sense that the final stack size will be close to the
> minimum stack size for which the computation would have succeeded, up to
> a factor of two).
I don't think that parisizemax should behave like that. If an algorithm can work significantly faster by using a larger stack size than the minimum, it should do that.

> I assume you would still like to call gerepileall() when parisize reaches
> parisizemax, otherwise you will get a stack overflow.
Of course. In my proposal, if parisize == parisizemax, then the current behaviour remains. If parisize < parisizemax, then the number of GCs is limited.
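
To make the proposal concrete, here is a minimal sketch of the decision that would be taken when the stack fills up. All names here (stack_ctx, on_stack_full, gc_limit) are hypothetical and not part of the PARI API; this only illustrates the policy, it is not an implementation.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

typedef struct {
    size_t parisize;     /* current stack size */
    size_t parisizemax;  /* hard upper limit on the stack size */
    int    gc_count;     /* garbage collections done so far */
    int    gc_limit;     /* cap on collections while the stack can still grow */
} stack_ctx;

/* Decide what to do when the stack fills up:
 * return true to run a garbage collection, false to enlarge the stack. */
static bool on_stack_full(stack_ctx *ctx)
{
    if (ctx->parisize == ctx->parisizemax)
        return true;              /* cannot grow: current behaviour, always collect */
    if (ctx->gc_count < ctx->gc_limit) {
        ctx->gc_count++;          /* still below the cap: collect as usual */
        return true;
    }
    return false;                 /* cap reached: grow the stack (up to parisizemax) */
}

int main(void)
{
    stack_ctx ctx = { 1u << 23, 1u << 30, 0, 3 };  /* 8 MB now, 1 GB max, at most 3 GCs */
    for (int i = 0; i < 5; i++)
        printf("stack full #%d -> %s\n", i + 1,
               on_stack_full(&ctx) ? "collect" : "grow stack");
    return 0;
}
```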

> Your proposed change would cause the stack to fill up quickly to parisizemax
No, the stack size would still be bounded independently of parisizemax.

I implemented my proposal for determinant (PARI bug #1655) in Sage:
http://git.sagemath.org/sage.git/tree/build/pkgs/pari/patches/det_garbage.patch?id=5cd1f002e107df3d7ceadf71e89188f6776676af
This works perfectly: it doesn't cause an explosion of the stack in Sage and it doesn't cause an unwanted slowdown. This patch was written for the 2.7 stack model, so it doesn't take into account parisizemax.
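
For readers who do not want to dig through the patch, the idea is roughly the following: inside a long loop, call gerepile only when the free stack drops below a threshold, instead of collecting on every iteration. The sketch below is my own illustration of that pattern (it is not the patch itself), written against the PARI 2.7 C API, where bot, top and avma are global stack pointers; the 1/8 threshold is an arbitrary choice for the example.

```c
#include <pari/pari.h>

/* Compute n! on the PARI stack, collecting garbage only when less than
 * about 1/8 of the stack remains free, instead of after every iteration. */
GEN product_with_lazy_gc(long n)
{
  pari_sp av = avma;
  GEN x = gen_1;
  long i;
  for (i = 1; i <= n; i++)
  {
    x = gmulgs(x, i);
    if (avma - bot < (top - bot) / 8)   /* stack nearly full: collect now */
      x = gerepilecopy(av, x);
  }
  return gerepilecopy(av, x);
}

int main(void)
{
  pari_init(8000000, 500000);                        /* 8 MB stack */
  pari_printf("%Ps\n", product_with_lazy_gc(100));   /* 100! */
  pari_close();
  return 0;
}
```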


Cheers,
Jeroen.