Karim BELABAS on Tue, 14 Oct 2003 17:53:59 +0200 (MEST)



Re: polredabs failure


On Mon, 13 Oct 2003, Igor Schein wrote:
> I don't know if it's too early to talk about efficiency yet, but for
> the following polynomial
>
> {
> x^32 + 160*x^30 + 10592*x^28 + 380240*x^26 + 8223880*x^24 + 113613696*x^22 +
>   1038843456*x^20 + 6414739680*x^18 + 26951021784*x^16 + 76644143232*x^14 +
>   144482787456*x^12 + 172572281280*x^10 + 118940935456*x^8 + 37693530112*x^6 +
>   1869026048*x^4 + 31222400*x^2 + 169744
> }
>
> polredabs has slowed down considerably - it used to take only 8s, now it
> takes 40s if I start with at least an 8MB stack, and much longer if I
> start with the 32-bit default 4MB stack - is it operating on a low stack
> with a lot of GC?

You can't have everything. This example is not very surprising (many
subfields, and quadratic ones at that): one should be thankful it works
at all!
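
For concreteness, here is a sketch in current gp syntax (nfsubfields is only
an illustration; it may itself take a while, and a bigger stack, on an input
of this size):

  \\ the degree-32 polynomial from the report above
  {
    P = x^32 + 160*x^30 + 10592*x^28 + 380240*x^26 + 8223880*x^24 +
        113613696*x^22 + 1038843456*x^20 + 6414739680*x^18 + 26951021784*x^16 +
        76644143232*x^14 + 144482787456*x^12 + 172572281280*x^10 +
        118940935456*x^8 + 37693530112*x^6 + 1869026048*x^4 + 31222400*x^2 +
        169744;
  }
  S = nfsubfields(P);                        \\ all subfields, as [poly, embedding] pairs
  vector(length(S), i, poldegree(S[i][1]))   \\ their degrees: note how many are small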

Also, guaranteeing that all minimal vectors are kept carries a high memory
price. Don't use polredabs on large examples with the default stack.
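
Concretely (the sizes below are only a guess; adjust to your machine), either
put something like  parisize = 32M  in your .gprc, or enlarge the stack from
within gp before the expensive call:

  allocatemem(32 * 10^6);   \\ grow the PARI stack to about 32MB first
  polredabs(P)              \\ P = the degree-32 polynomial defined in the snippet above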

On some examples, the new method is bound to slow things down. Picture this as
a probabilistic algorithm: we do a random walk inside some ellipsoid, hoping to
quickly find small generators. My new heuristic tries to ensure that elements
are checked in roughly increasing norm order [ avoiding some of the subspaces
generating strict subfields, thanks to an (expensive) preprocessing step ].
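
To make that picture concrete, here is a toy gp sketch of the search idea (not
the actual polredabs code, and numerically naive): it lists elements of
Z[x]/(P) of T2 norm up to a bound and keeps those whose characteristic
polynomial is squarefree, i.e. the ones that generate the full field.

  \\ Gram matrix of the T2 form on Z[x]/(P), basis 1, x, ..., x^(n-1).
  \\ (polroots works at the current precision; raise \p for large inputs.)
  t2gram(P) =
  {
    my(n = poldegree(P), r = polroots(P), M);
    M = matrix(n, n, i, j, r[i]^(j-1));  \\ row i = embedding sending x to the i-th root
    real(conj(M~) * M);
  }

  \\ Characteristic polynomials of the field generators of T2 norm <= B.
  smallgens(P, B) =
  {
    my(V, a, q, out = List());
    V = qfminim(t2gram(P), B, , 2)[3];   \\ vectors of T2 norm <= B (flag 2: real Gram matrix)
    for (k = 1, length(V),
      a = Mod(Polrev(V[,k]), P);         \\ candidate element of Z[x]/(P)
      q = charpoly(a, y);                \\ its characteristic polynomial, in the variable y
      if (issquarefree(q), listput(out, q)));  \\ squarefree charpoly <=> a generates the field
    Vec(out);
  }

  smallgens(x^4 + 1, 5)   \\ e.g.: small generators of Q(zeta_8)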

In this particular example, the "core" is entirely filled with non-generators.
This could be detected in the subfield search phase, but at an even higher cost
[ due to the special nature of the Galois resolvents you're throwing at it,
polynomial factorization is highly non-trivial ]. On the other hand, there is a
small generator on the "boundary", which for some reason the old heuristic
chanced upon early.

I could do both kinds of search "simultaneously" [ the code is already far
beyond the point of unreadability; it can be obfuscated further without adverse
effect ]. But in most cases this would slow things down by a factor of 2, a
priori to no avail.

In any case, I would not call this slowdown considerable. Considerable = going
from easy (< 1 min) to almost unfeasible (> 1 day).

Again, you're playing against an exponential-time algorithm. In huge
dimensions, you're bound to lose in some cases (n = 32 is huge, and your input
is an almost guaranteed worst case).

I'm much more interested in examples that remain unfeasible, especially decent
ones without too many subfields; not Swinnerton-Dyer polynomials.

Cheers,

    Karim.
-- 
Karim Belabas                     Tel: (+33) (0)1 69 15 57 48
Dép. de Mathématiques, Bât. 425   Fax: (+33) (0)1 69 15 60 19
Université Paris-Sud              http://www.math.u-psud.fr/~belabas/
F-91405 Orsay (France)            http://www.parigp-home.de/  [PARI/GP]