Assuming you mean Leonid Kovalev's interpretation, the distribution of the hitting time in the $N = 1$ case is the same as the distribution of the number of cycles of a random permutation of $[n]$.
To be more specific, I'll change coordinates. Let $X_0 = (x_0^1, \ldots, x_0^N) = (S, S, \ldots, S)$. Let $X_1 = (x_1^1, \ldots, x_1^N)$, where $x_1^j$ is chosen uniformly at random from $\{ 0, 1, \ldots, x_0^j-1 \}$. Define $X_2$ similarly in terms of $X_1$, and so on.
Then the sequence $(x_0^1, x_1^1, x_2^1, x_3^1, \ldots)$ is as follows:
- $x_0^1 = S$, of course.
- $x_1^1$ is uniformly distributed on $\{ 0, \ldots, S-1 \}$.
- $x_2^1$ is uniformly distributed on $\{ 0, \ldots, x_1^1-1 \}$.
and so on...
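This single-coordinate recursion is easy to simulate directly; here is a minimal Python sketch (the function name `hitting_time` is my own):

```python
import random

def hitting_time(S, rng=None):
    """Number of steps until the chain x = S, x -> uniform{0, ..., x-1}
    first reaches 0."""
    rng = rng or random.Random()
    x, steps = S, 0
    while x > 0:
        x = rng.randrange(x)  # uniform on {0, 1, ..., x-1}
        steps += 1
    return steps
```

Averaging `hitting_time(S)` over many trials gives an empirical estimate of the expected hitting time discussed below.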
In particular, the distribution of $x_1^1$ is the distribution of the number of elements of a random permutation of $S$ elements which are {\it not} in the cycle containing 1; more generally, the distribution of $x_k^1$ is the distribution of the number of elements which are {\it not} in any of the $k$ cycles with the smallest minima.
So the distribution of the smallest $k$ for which $x_k^1 = 0$ is exactly the distribution of the number of cycles of a random permutation of $\{ 1, \ldots, S \}$; its expectation is $1 + 1/2 + \cdots + 1/S \sim \log S + \gamma$, where $\gamma = 0.577\ldots$ is the Euler-Mascheroni constant.
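One can check this equivalence numerically by comparing the empirical mean of the hitting time with the empirical mean of the cycle count of a uniform random permutation, and with the harmonic number; a sketch (helper names are my own):

```python
import random

def hitting_time(S, rng):
    """Steps for x = S, x -> uniform{0, ..., x-1}, to reach 0."""
    x, steps = S, 0
    while x > 0:
        x = rng.randrange(x)
        steps += 1
    return steps

def num_cycles(n, rng):
    """Number of cycles of a uniformly random permutation of {0, ..., n-1}."""
    perm = list(range(n))
    rng.shuffle(perm)
    seen = [False] * n
    cycles = 0
    for i in range(n):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return cycles

rng = random.Random(1)
S, trials = 50, 20000
mean_hit = sum(hitting_time(S, rng) for _ in range(trials)) / trials
mean_cyc = sum(num_cycles(S, rng) for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, S + 1))
print(mean_hit, mean_cyc, harmonic)  # all three should be close
```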
In the higher-dimensional cases, the time to reach $(0, \ldots, 0)$ is just the {\it maximum} of the number of cycles of $N$ permutations chosen uniformly at random; this should still have expectation $\log S$ plus a constant depending on $N$.
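Since the coordinates evolve independently (and a coordinate at 0 stays at 0), the hitting time of $(0, \ldots, 0)$ is the maximum of $N$ i.i.d. single-coordinate hitting times, which makes the $N$-dependence easy to explore numerically; a sketch (names my own):

```python
import random

def hitting_time(S, rng):
    """Steps for x = S, x -> uniform{0, ..., x-1}, to reach 0."""
    x, steps = S, 0
    while x > 0:
        x = rng.randrange(x)
        steps += 1
    return steps

def max_hitting_time(S, N, rng):
    # coordinates are independent, so the time for the full vector to
    # reach (0, ..., 0) is the max of N single-coordinate hitting times
    return max(hitting_time(S, rng) for _ in range(N))

rng = random.Random(2)
S, trials = 100, 5000
results = {}
for N in (1, 2, 4, 8):
    results[N] = sum(max_hitting_time(S, N, rng) for _ in range(trials)) / trials
    print(N, results[N])  # grows slowly with N: roughly log S + const(N)
```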