Okay, that's a misleading title. This is a somewhat subtler problem than undergraduate linear algebra, although I suspect there's still an easy answer. But I couldn't resist :D.
Here's the actual problem: We're given a black-box linear transformation $V \to W$, where $V, W$ are vector spaces of dimensions $m, n$ respectively (say $m < n$), and we want to know if it has full rank. (Numerical considerations aren't an issue; if you want, say it's over a finite field.) This is easy to do in time $O(m^2 n)$ and with $m$ calls to the black-box function, just by computing the image of a basis of $V$ and using Gaussian elimination. It's also immediately obvious that we can't do better than $m$ calls to the function with a deterministic algorithm, and I'm pretty sure, but haven't quite managed to prove, that you can't beat Gaussian elimination asymptotically either.
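For concreteness, here is a minimal sketch of that deterministic algorithm over $GF(p)$ with $p$ prime; the black-box map `T` (taking a length-$m$ list to a length-$n$ list) and its interface are assumptions for illustration, not anything canonical.

```python
def has_full_rank(T, m, n, p):
    """Decide whether the linear map T: GF(p)^m -> GF(p)^n has rank m,
    using exactly m black-box calls plus Gaussian elimination (O(m^2 n))."""
    # Query T at each standard basis vector of GF(p)^m: m calls in total.
    columns = []
    for i in range(m):
        e_i = [0] * m
        e_i[i] = 1
        columns.append(T(e_i))  # image vector in GF(p)^n

    # Row-reduce the n x m matrix whose columns are those images.
    rows = [[columns[j][i] % p for j in range(m)] for i in range(n)]
    rank = 0
    for col in range(m):
        # Look for a pivot in this column at or below row `rank`.
        pivot = next((r for r in range(rank, n) if rows[r][col] != 0), None)
        if pivot is None:
            return False  # this image depends linearly on the earlier ones
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        inv = pow(rows[rank][col], p - 2, p)  # inverse mod p, since p is prime
        rows[rank] = [(x * inv) % p for x in rows[rank]]
        for r in range(n):
            if r != rank and rows[r][col] != 0:
                f = rows[r][col]
                rows[r] = [(a - f * b) % p for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return True  # all m columns produced pivots, so the rank is m
```

For instance, `T = lambda x: [sum(A[i][j] * x[j] for j in range(m)) % p for i in range(n)]` for a fixed $n \times m$ matrix `A` plays the role of the black box.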
But can we do better if we just want a probabilistic algorithm? What if we're allowed to make as many function calls as we want? What's the best lower bound we can get, probabilistically? These are probably pretty trivial questions (since everything's linear-algebraic and nice), but I just don't know how to approach them.
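To make the probabilistic question concrete, here is a hedged sketch of the naive randomized variant: probe the map at uniformly random inputs and maintain the rank of the images seen so far. It is one-sided (a "full rank" answer is always correct), but it can't possibly answer before the $m$-th call, which is exactly what the lower-bound question is about. The interface matches the sketch above and is again just an assumption.

```python
import random

def randomized_full_rank(T, m, n, p, queries):
    """Monte Carlo probe: query T at random inputs over GF(p) and track the
    rank of the images online. Returns True only once the images span an
    m-dimensional subspace, so True answers are always correct; False after
    `queries` calls is merely inconclusive."""
    pivots = {}  # leading index -> stored image, normalized to leading entry 1
    for _ in range(queries):
        x = [random.randrange(p) for _ in range(m)]
        v = [c % p for c in T(x)]
        # Online reduction: cancel v's leading entry against stored rows until
        # v is zero (dependent) or contributes a new pivot (rank goes up by 1).
        while True:
            lead = next((i for i in range(n) if v[i] != 0), None)
            if lead is None:
                break  # v lies in the span of the earlier images
            if lead in pivots:
                f = v[lead]
                v = [(a - f * b) % p for a, b in zip(v, pivots[lead])]
            else:
                inv = pow(v[lead], p - 2, p)  # p assumed prime
                pivots[lead] = [(a * inv) % p for a in v]
                break
        if len(pivots) == m:
            return True  # the images span m dimensions: full rank certified
    return False  # inconclusive: the map may still have rank m
```

Over a large field, $k$ random inputs span a $k$-dimensional subspace with high probability, so this variant spends the same $m$ calls as the deterministic one; the question is whether any cleverer use of randomness can do better.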