Saturday, 18 July 2009

pr.probability - Number of uniform rvs needed to cross a threshold

This problem has delighted me since I first encountered it before college. I wrote up a generalization for a less mathematical audience in a poker forum.



One way to look at the original is that if $f(x) = E(N(x))$, where $N(x)$ is the number of uniform $[0,1]$ variables needed for their partial sum to reach $x$, then $f(x) = 1 + \int_{x-1}^x f(t)\,dt$, satisfying the initial condition that $f(x) = 0$ on $(-1,0)$. $f$ is 1 more than the average of $f$ on the previous unit interval.
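Here's a quick Monte Carlo sketch (mine, not part of the original writeup) that estimates $f(x)$ by brute force; the agreement with the asymptote $2x + \frac23$ derived below improves as $x$ grows:

    # Monte Carlo estimate of f(x) = E[N(x)], the expected number of
    # Uniform(0,1) draws needed for the running sum to reach x.
    # (For continuous distributions, crossing with >= or > is the same.)
    import random

    def draws_to_cross(x):
        total, n = 0.0, 0
        while total < x:
            total += random.random()
            n += 1
        return n

    random.seed(0)
    for x in (1.0, 3.0, 10.0):
        trials = 100_000
        est = sum(draws_to_cross(x) for _ in range(trials)) / trials
        print(f"x={x}: estimate {est:.3f} vs 2x + 2/3 = {2*x + 2/3:.3f}")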



We can get rid of the constant by using $g(x) = f(x) - 2x$, which satisfies $g(x) = \int_{x-1}^x g(t)\,dt$. $g$ is equal to its average on the previous interval. Now the initial condition becomes nontrivial: $g(x) = -2x$ on $(-1,0)$, and it is from this $-2x$ that the asymptotic value of $2/3$ comes.
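The integral equation can also be marched forward numerically. A rough sketch, assuming a trapezoidal rule with step $h$ (both the scheme and the grid are my choices, not from the post):

    # Solve f(x) = 1 + integral_{x-1}^x f(t) dt forward from f = 0 on (-1,0)
    # by the trapezoid rule, then watch g(x) = f(x) - 2x approach 2/3.
    h = 0.001
    m = round(1 / h)            # grid steps per unit interval
    n = round(10.0 / h)         # solve out to x = 10
    f = [0.0] * (n + m + 1)     # indices 0..m-1 cover [-1, 0), where f = 0

    for i in range(m, n + m + 1):
        # trapezoid over [x-1, x]; the unknown f[i] enters with weight h/2
        rhs = 1 + h * (0.5 * f[i - m] + sum(f[i - m + 1:i]))
        f[i] = rhs / (1 - 0.5 * h)

    for x in (1.0, 5.0, 10.0):
        i = m + round(x / h)
        print(f"x={x}: f(x) - 2x = {f[i] - 2 * x:.4f}")   # tends to 0.6667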



$g$ is asymptotically constant, and we can guess from geometry and verify that the weighted average
$\int_0^1 2t\,g(x+t)\,dt$ is independent of $x$, hence equal to the asymptotic value of $g$. The value at $x=-1$ is
$\int_0^1 4t(1-t)\,dt = \frac23$, so that is the asymptotic value of $g$.
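To spell out the verification (my elaboration), formally differentiate under the integral sign and integrate by parts:

$\frac{d}{dx}\int_0^1 2t\,g(x+t)\,dt = \int_0^1 2t\,g'(x+t)\,dt = 2g(x+1) - \int_0^1 2\,g(x+t)\,dt = 2g(x+1) - 2\int_x^{x+1} g(u)\,du = 0,$

since the integral equation says $g(x+1)$ equals the average of $g$ over $(x, x+1)$.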



The same technique works for continuous distributions supported on $\mathbb{R}^+$ more general than the uniform distribution, and the answer turns out to be remarkably simple. Let's find an analogous conserved quantity.



Let $\alpha$ be the density function, and let the $n$th moment be $\alpha_n$. We'll want $\alpha_2$ to exist.



Let $f(x)$ be the average index of the first partial sum of IID random variables with density $\alpha$ that is at least $x$. Then $f(x) = 1 + \int_{-\infty}^x f(t)\,\alpha(x-t)\,dt$, and $f(x) = 0$ for $x < 0$. (We could let the lower limit be $0$, too.)



Let $g(x) = f(x) - x/\alpha_1$. Then $g(x) = \int_{-\infty}^x g(t)\,\alpha(x-t)\,dt$, with $g(x) = -x/\alpha_1$ on $\mathbb{R}^-$.



We'll choose $h$ so that $H(x) = \int_{-\infty}^x g(t)\,h(x-t)\,dt$ is constant.



$0 = H'(x) = g(x)\,h(0) + \int_{-\infty}^x g(t)\,h'(x-t)\,dt.$



This integral looks like the integral equation for $g$ if we choose $h'(x) = c\,\alpha(x)$, and $c = -1$ satisfies the equation. So, if $h(x) = \int_x^\infty \alpha(t)\,dt$, then $h(0) = 1$, $h'(x) = -\alpha(x)$, and $H'(x) = g(x) - \int_{-\infty}^x g(t)\,\alpha(x-t)\,dt = 0$, so $H(x)$ is constant.



Let the asymptotic value of $g$ be $v$. Since $\int_0^\infty h(y)\,dy = \int_0^\infty y\,\alpha(y)\,dy = \alpha_1$ (by parts), the value of $H(x)$ is both $H(\infty) = v\,\alpha_1$ and



$H(0) = \int_{-\infty}^0 g(t)\,h(0-t)\,dt$

$H(0) = \frac{1}{\alpha_1} \int_0^\infty y\,h(y)\,dy~~$ (substituting $y = -t$ and using $g(t) = -t/\alpha_1$ on $\mathbb{R}^-$)

$H(0) = \frac{1}{2\alpha_1} \int_0^\infty y^2\,\alpha(y)\,dy~~$ (by parts, since $h' = -\alpha$)

$H(0) = \frac{\alpha_2}{2\alpha_1}$

Equating the two values of $H$,

$v\,\alpha_1 = \frac{\alpha_2}{2\alpha_1}$

$v = \frac{\alpha_2}{2\alpha_1^2}$.



So, $f(x)$ is asymptotic to $x/\alpha_1 + \alpha_2/(2\alpha_1^2)$.
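As a numerical sanity check (my sketch; Beta(2,2) is just a convenient test density, with $\alpha_1 = \frac12$ and $\alpha_2 = \frac3{10}$, so the predicted asymptote is $2x + \frac35$):

    # Monte Carlo check of f(x) ~ x/alpha_1 + alpha_2/(2 alpha_1^2)
    # for Beta(2,2) summands: alpha_1 = 1/2, alpha_2 = 3/10 => 2x + 3/5.
    import random

    def mean_crossing_index(x, draw, trials=100_000):
        total = 0
        for _ in range(trials):
            s, n = 0.0, 0
            while s < x:
                s += draw()
                n += 1
            total += n
        return total / trials

    random.seed(1)
    for x in (5.0, 10.0):
        est = mean_crossing_index(x, lambda: random.betavariate(2, 2))
        print(f"x={x}: estimate {est:.3f} vs 2x + 3/5 = {2*x + 0.6:.3f}")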



For the original uniform distribution on $[0,1]$, $\alpha_1 = \frac12$ and $\alpha_2 = \frac13$, so $f(x) = 2x + \frac23 + o(1)$.



As a check, an exponential distribution with mean $1$ has second moment $2$, and we get that $f(x)$ is asymptotic to $x+1$. In fact, in that case, $f(x) = x+1$ on $\mathbb{R}^+$. If you have memoryless light bulbs with average life $1$, then at time $x$, an average of $x$ bulbs have burned out, and you are on the $(x+1)$st bulb.
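A tiny simulation of the memoryless case (my sketch) confirms this: $N(x) - 1$ counts the arrivals of a rate-$1$ Poisson process in $[0,x]$, so $E(N(x)) = x + 1$ exactly.

    # With Exp(1) summands, N(x) - 1 is Poisson(x), so E[N(x)] = x + 1 exactly.
    import random

    random.seed(2)
    x, trials, total = 7.0, 100_000, 0
    for _ in range(trials):
        s, n = 0.0, 0
        while s < x:
            s += random.expovariate(1.0)
            n += 1
        total += n
    print(f"estimate {total / trials:.3f} vs x + 1 = {x + 1}")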
