Monday, 3 May 2010

pr.probability - Non-existence of integral with respect to Poisson Random Measure

As I mentioned in my comment, you can prove the statement and its converse by looking at the moment generating function. Supposing that f ≥ 0 is measurable and λ > 0 is a real number, the following is true for a Poisson point measure ξ with intensity Eξ = μ,

$$\mathbb{E}\left[e^{-\lambda\xi f}\right]=\exp\left(-\mu\left(1-e^{-\lambda f}\right)\right).$$
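
For completeness, here is a sketch of where this formula comes from (this is essentially Kallenberg's Lemma 12.2, and the computation below is the standard one). If f = ∑_{k=1}^n c_k 1_{A_k} is a simple function with the A_k disjoint and μ(A_k) < ∞, then the variables ξ(A_k) are independent Poisson random variables with means μ(A_k), so

$$\mathbb{E}\left[e^{-\lambda\xi f}\right]=\prod_{k=1}^n\mathbb{E}\left[e^{-\lambda c_k\xi(A_k)}\right]=\prod_{k=1}^n\exp\left(-\mu(A_k)\left(1-e^{-\lambda c_k}\right)\right)=\exp\left(-\mu\left(1-e^{-\lambda f}\right)\right),$$

using the Poisson moment generating function E[e^{-tN}] = exp(-m(1 - e^{-t})) for N Poisson of mean m. The general case f ≥ 0 follows by taking simple functions increasing to f and applying monotone convergence on both sides.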

You can calculate the probability that ξf is finite from this using monotone convergence,

$$\mathbb{P}\left(\xi f < \infty\right)=\lim_{\lambda\downarrow 0}\exp\left(-\mu\left(1-e^{-\lambda f}\right)\right).$$
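
To spell out the monotone convergence step: for each fixed outcome, e^{-λξf} increases as λ decreases to 0, with

$$e^{-\lambda\xi f}\uparrow 1_{\{\xi f<\infty\}}\quad\text{as }\lambda\downarrow 0$$

(interpreting e^{-λ·∞} as 0), so taking expectations gives the stated limit.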

If μ(f∧1) < ∞ then (1 - e^{-λf}) ≤ f∧1 for all λ ≤ 1, so dominated convergence gives μ(1 - e^{-λf}) → 0 as λ → 0. So E[e^{-λξf}] → 1 and, by the limit above, ξf < ∞ almost surely.
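
The domination used here is just the elementary bound 1 - e^{-x} ≤ x∧1 for x ≥ 0, which for λ ≤ 1 gives

$$1-e^{-\lambda f}\le(\lambda f)\wedge 1\le f\wedge 1,$$

and the right hand side is μ-integrable by assumption, while 1 - e^{-λf} → 0 pointwise as λ → 0.
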
Now for the converse statement: if μ(f∧1) = ∞ then the inequality (1 - e^{-λf}) ≥ ½λ(f∧1) for λ ≤ 1 shows that μ(1 - e^{-λf}) = ∞. So E[e^{-λξf}] = 0 for every λ > 0, giving ξf = ∞ almost surely.
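
The inequality used for the converse can be checked directly: for 0 ≤ x ≤ 1 we have 1 - e^{-x} ≥ x - x²/2 ≥ x/2, so for λ ≤ 1,

$$1-e^{-\lambda f}\ge 1-e^{-\lambda(f\wedge 1)}\ge\tfrac{1}{2}\lambda(f\wedge 1),$$

since λ(f∧1) lies in [0,1]. Integrating with respect to μ gives μ(1 - e^{-λf}) ≥ ½λ μ(f∧1) = ∞.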

And, yes, the Kolmogorov zero-one law does indeed imply that ξf < ∞ with probability 0 or 1. Splitting the space up into a countable sequence of disjoint measurable sets S_n on which both μ and f are bounded, we want to know whether the sum ∑_n ξ(1_{S_n} f) of independent random variables is finite, which is a tail event.
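
Slightly more explicitly (a sketch, taking the S_n to be disjoint with union the whole space): the restrictions of ξ to disjoint sets are independent, each term X_n = ξ(1_{S_n} f) is almost surely finite since ξ(S_n) is Poisson with mean μ(S_n) < ∞ and f is bounded on S_n, and

$$\{\xi f<\infty\}=\Big\{\sum_n X_n<\infty\Big\}=\Big\{\sum_{n\ge N} X_n<\infty\Big\}\quad\text{for every }N,$$

so this event lies in the tail σ-algebra of the sequence (X_n) and, by Kolmogorov's zero-one law, has probability 0 or 1.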

Looking at my copy of Kallenberg, I see that the statement you give, and which I have just proven above, is Lemma 12.13. Quoting his proof precisely:

If ξ|f| < ∞ a.s. then μ(|f|∧1) < ∞ by Lemma 12.2. The converse implication was established in the proof of the same lemma.

His Lemma 12.2 is the statement of the moment generating function which I used above. As you say, it doesn't seem that he establishes the converse implication there at all. However, it is a relatively simple step using my argument above.
