Sunday, 14 September 2008

nt.number theory - Slick proof related to choosing points from an interval in order

Choose a point anywhere in the unit interval $[0, 1]$. Now choose a second point from the same interval so that there is one point in each half, $[0, \frac12]$ and $[\frac12, 1]$. Now choose a third point so that there is one point in each third of the interval, and so on. How long can you choose points like this?



In other words, what is the largest $n$ such that there exists a (finite) sequence $(a_i)_{i=1}^n$ so that for all $k \le n$, among the first $k$ points $(a_i)_{i=1}^k$, there is at least one in each interval $[0, \frac1k], [\frac1k, \frac2k], \ldots, [\frac{k-1}k, 1]$?
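For concreteness, here is a minimal Python sketch of this condition; the function name `satisfies` and the `closed` flag are my own, and exact rational arithmetic via `Fraction` sidesteps floating-point trouble at the interval boundaries.

```python
from fractions import Fraction

def satisfies(points, closed=True):
    """Check the prefix condition: for every k <= len(points), each of the
    k subintervals of [0, 1] contains at least one of the first k points.
    With closed=True the subintervals are [j/k, (j+1)/k]; with
    closed=False they are the half-open [j/k, (j+1)/k)."""
    for k in range(1, len(points) + 1):
        prefix = points[:k]
        for j in range(k):
            lo, hi = Fraction(j, k), Fraction(j + 1, k)
            if closed:
                hit = any(lo <= p <= hi for p in prefix)
            else:
                hit = any(lo <= p < hi for p in prefix)
            if not hit:
                return False
    return True

# Two points, one in each half: fine for k = 1 and k = 2.
print(satisfies([Fraction(2, 5), Fraction(3, 5)]))   # True
# Both points in the left half: the interval [1/2, 1] is missed at k = 2.
print(satisfies([Fraction(2, 5), Fraction(9, 20)]))  # False
```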



My question



Is there a slick proof that $n$ is bounded? (Note that by compactness, the existence of such a sequence for all $n$ is equivalent to the existence of an infinite sequence with the same property.)



Backstory



I was recently re-reading old columns of Martin Gardner's, and he describes this problem in "Bulgarian solitaire and other seemingly endless tasks". (Original column 1983, available in his collections "The last recreations: ..." and "The colossal book of mathematics".) He says that the problem first appeared in "One hundred problems in elementary mathematics" by Hugo Steinhaus. There, Steinhaus shows that there is a sequence of length 14, but that there is none of length 75. The first published proof that the longest sequence has length 17 was by Elwyn Berlekamp and Ron Graham in their paper "Irregularities in the distributions of finite sequences" in the Journal of Number Theory. This was followed by a shorter proof by Mieczysław Warmus in "A supplementary note on the irregularities of distributions" in the same journal.



Now, these proofs all rely on case analysis to one degree or another. They vary in complexity, with, oddly, Warmus's proof of the optimal $n$ being the shortest. It's also not too hard to write a computer search oneself that finds the optimal $n$; a sketch of one appears below. However, I feel that because of the elegant nature of the problem, there should be some "nice" proof -- not of optimality, but simply that such a sequence can't continue forever. Can anyone find one?
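For what it's worth, here is one way such a search can be organized; this is my own sketch, not the method of either published proof. It uses the usual half-open formulation: among the first $k$ points, each interval $[\frac{j-1}k, \frac jk)$ contains exactly one point, so the point of rank $i$ must lie in $[\frac{i-1}k, \frac ik)$, and only the pattern of ranks matters. If I have the bookkeeping right, it reports the optimum of 17 cited above.

```python
from fractions import Fraction

def longest(limit=20):
    """Depth-first search over rank patterns (half-open intervals).

    A state stores, for the points chosen so far in sorted order, the
    interval of values each point may still take.  At step k the point of
    (0-based) rank r among the first k points must lie in [r/k, (r+1)/k),
    so we branch on the rank of the newly added point and tighten every
    point's interval.  Since ranks increase along the sorted order, a
    state is realizable by actual points iff every interval keeps a
    non-empty interior, which is exactly the pruning test below."""
    best = 0

    def extend(intervals):
        nonlocal best
        m = len(intervals)
        best = max(best, m)
        if m >= limit:
            return
        k = m + 1
        for r in range(k):  # rank of the new point among the first k
            tightened = []
            for i, (lo, hi) in enumerate(intervals):
                rank = i if i < r else i + 1  # old points shift past rank r
                lo = max(lo, Fraction(rank, k))
                hi = min(hi, Fraction(rank + 1, k))
                if lo >= hi:  # empty interior: prune this branch
                    break
                tightened.append((lo, hi))
            else:
                tightened.insert(r, (Fraction(r, k), Fraction(r + 1, k)))
                extend(tightened)

    extend([])
    return best

print(longest())  # should print 17, the optimum cited above
```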



Technical note: The problem is usually stated with half-open intervals. I made them closed so that compactness worked. (Edit: The possibility that this changes the answer did occur to me. I assume it doesn't, and I'll check by computer soon. I am fine with answers for any kind of interval -- open, closed, or half-open.)
