Assuming you mean Leonid Kovalev's interpretation, the distribution of the hitting time in the $N = 1$ case is the same as the distribution of the number of cycles of a uniformly random permutation of $[S] = \{1, \ldots, S\}$.
To be more specific, I'll change coordinates. Let $X^0 = (x_1^0, \ldots, x_N^0) = (S, S, \ldots, S)$. Let $X^1 = (x_1^1, \ldots, x_N^1)$, where $x_j^1$ is chosen uniformly at random from $\{0, 1, \ldots, x_j^0 - 1\}$. Define $X^2$ similarly in terms of $X^1$, and so on.
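After this change of coordinates, the chain is easy to simulate directly. A minimal sketch (the function names `step` and `hitting_time` are my own, not from the original):

```python
import random

def step(x):
    # From state x = (x_1, ..., x_N), each coordinate x_j moves to a
    # uniform choice from {0, 1, ..., x_j - 1}; a coordinate at 0 stays at 0.
    return tuple(random.randrange(xj) if xj > 0 else 0 for xj in x)

def hitting_time(S, N, rng_seed=None):
    # Number of steps until the chain started at (S, ..., S) reaches (0, ..., 0).
    if rng_seed is not None:
        random.seed(rng_seed)
    x, t = (S,) * N, 0
    while any(x):
        x = step(x)
        t += 1
    return t
```

Since each positive coordinate strictly decreases every step, the hitting time is always between $1$ and $S$; for $N = 1$ and large $S$, averaging `hitting_time(S, 1)` over many runs should come out near $\log S + \gamma$.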
Then the sequence $(x_1^0, x_1^1, x_1^2, x_1^3, \ldots)$ behaves as follows:
- $x_1^0 = S$, of course.
- $x_1^1$ is uniformly distributed on $\{0, \ldots, S-1\}$.
- $x_1^2$ is uniformly distributed on $\{0, \ldots, x_1^1 - 1\}$.
and so on.
In particular, the distribution of $x_1^1$ is the distribution of the number of elements of a random permutation of $S$ elements which are {\it not} in the cycle containing 1. More generally, the distribution of $x_1^k$ is the distribution of the number of elements which are not in any of the $k$ cycles with the smallest minima.
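The first claim can be checked exhaustively for small $S$: over all $S!$ permutations, the number of elements outside the cycle containing 1 takes each value in $\{0, \ldots, S-1\}$ equally often, namely $(S-1)!$ times. A quick sketch (helper names are mine):

```python
from itertools import permutations
from collections import Counter

def cycle_of_one(perm):
    # perm is a tuple where perm[i] is the image of i+1; follow the cycle from 1.
    cycle, j = {1}, perm[0]
    while j != 1:
        cycle.add(j)
        j = perm[j - 1]
    return cycle

S = 5
# Tally, over all 5! = 120 permutations, how many elements lie
# outside the cycle containing 1.
counts = Counter(S - len(cycle_of_one(p)) for p in permutations(range(1, S + 1)))
```

Each count comes out to $(S-1)! = 24$, confirming the uniform distribution on $\{0, \ldots, S-1\}$.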
So the distribution of the smallest $k$ for which $x_1^k = 0$ is exactly the distribution of the number of cycles of a random permutation of $\{1, \ldots, S\}$; its expectation is $1 + 1/2 + \cdots + 1/S \sim \log S + \gamma$, where $\gamma = 0.577\ldots$ is the Euler-Mascheroni constant.
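The distributional identity can be verified exactly for small $S$: a dynamic program for the hitting-time law of the chain, against the cycle-count law $P(\text{cycles} = k) = c(S, k)/S!$, where $c(S, k)$ are the unsigned Stirling numbers of the first kind. A sketch in exact arithmetic (all names mine):

```python
from fractions import Fraction
from math import factorial

def hitting_law(S):
    # P[s] maps k to the probability that the N=1 chain started at s
    # first hits 0 at step k; recursion over the uniform first step.
    P = [{0: Fraction(1)}]
    for s in range(1, S + 1):
        law = {}
        for j in range(s):
            for k, p in P[j].items():
                law[k + 1] = law.get(k + 1, Fraction(0)) + p / s
        P.append(law)
    return P[S]

def cycle_law(S):
    # Unsigned Stirling numbers of the first kind: c(n, k) counts
    # permutations of [n] with exactly k cycles,
    # via c(n, k) = c(n-1, k-1) + (n-1) c(n-1, k).
    c = {(0, 0): 1}
    for n in range(1, S + 1):
        for k in range(1, n + 1):
            c[(n, k)] = c.get((n - 1, k - 1), 0) + (n - 1) * c.get((n - 1, k), 0)
    return {k: Fraction(c[(S, k)], factorial(S)) for k in range(1, S + 1)}
```

For every $S$ tried, `hitting_law(S)` and `cycle_law(S)` agree exactly, and the mean of either law is the harmonic number $H_S$.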
In the higher-dimensional cases, the time to reach $(0, \ldots, 0)$ is just the {\it maximum} of the numbers of cycles of $N$ permutations chosen independently and uniformly at random; this should still have expectation $\log S$ plus a constant depending on $N$.
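Since the $N$ coordinates evolve independently, the expectation of this maximum can be computed exactly from the cycle-count distribution via $E[\max] = \sum_{k \ge 1} \left(1 - F(k-1)^N\right)$, where $F$ is the CDF of the number of cycles. A sketch under that identity (names mine):

```python
from fractions import Fraction
from math import factorial

def expected_max_cycles(S, N):
    # Cycle-count distribution of a uniform permutation of [S],
    # via unsigned Stirling numbers of the first kind.
    c = {(0, 0): 1}
    for n in range(1, S + 1):
        for k in range(1, n + 1):
            c[(n, k)] = c.get((n - 1, k - 1), 0) + (n - 1) * c.get((n - 1, k), 0)
    pmf = [Fraction(c.get((S, k), 0), factorial(S)) for k in range(S + 1)]
    # E[max of N iid copies] = sum_{k>=1} P(max >= k) = sum_k (1 - F(k-1)^N).
    exp, cdf = Fraction(0), Fraction(0)
    for k in range(1, S + 1):
        exp += 1 - cdf ** N
        cdf += pmf[k]
    return exp
```

For $N = 1$ this reduces to the harmonic number $H_S$ (e.g. $H_5 = 137/60$), matching the formula above, and it grows with $N$ as expected.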