The double-exponential bound is indeed frighteningly easy to obtain. This ought to make Gröbner bases prohibitively costly, and indeed some effort has gone into non-Gröbner methods for solving polynomial systems (e.g. Grégoire Lecerf's Kronecker package for Magma).
In practice, Gröbner bases remain competitive, and users of Gröbner methods note that the double-exponential bound, though easily attained by explicit constructions, rarely occurs in the systems people are actually interested in. In particular, if you are working with a zero-dimensional ideal, the Gröbner basis of the system (with respect to any admissible ordering) can be computed in single-exponential time, which certainly implies that the basis itself is of single-exponential size. [Y. N. Lakshman, A single exponential bound on the complexity of computing Gröbner bases of zero-dimensional ideals, in: Effective Methods in Algebraic Geometry (Castiglioncello, 1990).]
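To make the zero-dimensional case concrete, here is a minimal sketch using SymPy (not the Magma/Kronecker tooling mentioned above); the system of two conics is a hypothetical example, chosen only because it has finitely many solutions:

```python
# A toy zero-dimensional system: two plane curves meeting in finitely
# many points. Any admissible ordering works; lex is convenient here.
from sympy import groebner, symbols

x, y = symbols('x y')
F = [x**2 + y**2 - 1, x**2 - y]

G = groebner(F, x, y, order='lex')
print(G)
# -> GroebnerBasis([x**2 - y, y**2 + y - 1], ...)
# For a zero-dimensional ideal, the lex basis is triangular: it contains
# a univariate polynomial in the last variable (here y**2 + y - 1), and
# all solutions are recovered by back-substitution.
```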
Sparsity does not help. As I recall, Mayr and Meyer-type examples are extremely sparse: the generators are binomials, i.e. differences of two monomials.
I've never heard of any serious probabilistic study of the number of elements, probably because it would be excessively hard and not very rewarding, given that there is no natural way to put a probability measure on your system. (Note that for a somewhat related problem, the average number of real roots of a random real polynomial, the answers you obtain do depend noticeably on the measure used.)
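For instance (a numerical sketch, not part of the original argument): with iid standard Gaussian coefficients (the Kac model) the expected number of real roots of a degree-n polynomial grows like (2/π) log n, while under the Kostlan/Weyl measure, where the coefficient of x^k has variance C(n, k), it is exactly √n. A quick Monte Carlo comparison in NumPy; the tolerance and trial count are arbitrary choices for the sketch:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

def count_real_roots(coeffs, tol=1e-8):
    # Crude test: a root counts as real if its imaginary part is tiny.
    roots = np.roots(coeffs)
    return int(np.sum(np.abs(roots.imag) < tol))

def average_real_roots(weights, trials=2000):
    # Coefficients are independent Gaussians scaled by the given weights.
    n = len(weights) - 1
    return sum(
        count_real_roots(rng.standard_normal(n + 1) * weights)
        for _ in range(trials)
    ) / trials

n = 50
kac = np.ones(n + 1)                                   # iid N(0, 1)
kostlan = np.sqrt([comb(n, k) for k in range(n + 1)])  # N(0, C(n, k))

print("Kac:    ", average_real_roots(kac))      # roughly (2/pi) log n, ~3
print("Kostlan:", average_real_roots(kostlan))  # exactly sqrt(n) ~ 7.07
```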