I have an irreducible Markov chain $(z_n)_{n\in\mathbb{N}}$ with state space $X$ and transition probability kernel $K$, so $K(x,\cdot)$ is a probability measure on the $\sigma$-algebra $\mathcal{B}(X)$ for every $x\in X$.
$\tau_B$ is the first entry time of the chain into the set $B$, i.e. $\tau_B:=\min\{i\in\mathbb{N}\mid i\ge 1,\ z_i\in B\}$.
A set $B\in\mathcal{B}(X)$ is defined to be recurrent if
$$\mathbb{P}_x\{\tau_B<\infty\}=1\quad\forall x\in X,$$
where $\mathbb{P}_x$ denotes the law of the chain started at $x$.
Now let's assume there is a recurrent set $B$ and a set $C$ such that, for some $\alpha>0$,
$$\alpha\,K(x,B)\le K(x,C)\quad\forall x\in X.$$
I am quite sure that $C$ has to be recurrent itself, because, roughly speaking, the chain would visit $C$ once for every $1/\alpha$ visits to $B$. But how do I prove this formally?
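(Not a proof, but the heuristic can be sanity-checked numerically. Below is a minimal Python sketch on a hypothetical 4-state chain; the matrix `P`, the sets `B = {1}`, `C = {2}` and the value `ALPHA = 0.5` are all made up for illustration and chosen so that $\alpha K(x,B)\le K(x,C)$ holds for every state. In the long run $C$ should then be visited at least $\alpha$ times as often as $B$.)

```python
import random

# Illustrative only: a hypothetical 4-state chain on {0, 1, 2, 3} with B = {1} and C = {2}.
# The transition matrix P is made up so that ALPHA * K(x, B) <= K(x, C) for every x.
ALPHA = 0.5
P = [
    [0.4, 0.2, 0.2, 0.2],
    [0.3, 0.2, 0.3, 0.2],
    [0.5, 0.2, 0.1, 0.2],
    [0.1, 0.4, 0.2, 0.3],
]
B, C = {1}, {2}

def simulate(steps, start=0, seed=0):
    """Run the chain for `steps` transitions and count visits to B and to C."""
    rng = random.Random(seed)
    x, hits_b, hits_c = start, 0, 0
    for _ in range(steps):
        x = rng.choices(range(4), weights=P[x])[0]
        hits_b += x in B
        hits_c += x in C
    return hits_b, hits_c

hits_b, hits_c = simulate(100_000)
# Heuristic check: in the long run C is visited at least ALPHA times as often as B.
print(f"visits to B: {hits_b}, visits to C: {hits_c}, ratio C/B: {hits_c / hits_b:.3f}")
```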
Answer
If $\alpha\ge 1$, then $K(x,C)\ge\alpha K(x,B)\ge K(x,B)$ for every $x$, so the chance to enter $C$ at any step is at least the chance to enter $B$; hence the chance to hit $C$ in finite time is at least as large and we are done. In the case $\alpha<1$, let $\tau_B(k)$ denote the stopping time of the $k$-th entry into $B$.
With the (strong) Markov property we get
$$\mathbb{P}_x\{\tau_B(k)<\infty\}=1\quad\forall k\ge 1,$$ which we shall call statement (I), and
$$\mathbb{P}_x\{\tau_B(k+1)-\tau_B(k)<\infty\}=1\quad\forall k\ge 1,$$ which we call statement (II).
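For completeness, a sketch of how (I) follows by induction over $k$ from the recurrence of $B$ and the strong Markov property (the same computation, applied to the first return to $B$ after $\tau_B(k)$, gives (II)):
$$\mathbb{P}_x\{\tau_B(k+1)<\infty\}=\mathbb{E}_x\Bigl[\mathbf{1}_{\{\tau_B(k)<\infty\}}\,\mathbb{P}_{z_{\tau_B(k)}}\{\tau_B<\infty\}\Bigr]=\mathbb{P}_x\{\tau_B(k)<\infty\},$$
where the inner probability equals $1$ because $B$ is recurrent; starting from $\mathbb{P}_x\{\tau_B(1)<\infty\}=1$ this gives (I).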
Now, because $\alpha K(x,B)\le K(x,C)$ for all $x\in X$, we have
$$\mathbb{P}_x\{\tau_C>\tau_B(k)\}\ \le\ \prod_{i=1}^{k}\Bigl(1-\alpha\sum_{j=1}^{\infty}\mathbb{P}_x\{z_{\tau_B(i)+j}\in B;\ z_{\tau_B(i)+m}\notin B\ \forall m<j\}\Bigr),$$
since each of the inner sums equals $1$ (because of (II)), the right-hand side equals $(1-\alpha)^k$. One can read the bound like this: each of the $k$ times the chain entered $B$ there was also a chance of at least $\alpha$ to enter $C$ instead, but $C$ was dodged.
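The estimate behind each factor is the one-step bound (a sketch; $\mathcal{F}_n$ denotes the natural filtration of the chain, a notation not used above):
$$\mathbb{P}_x\{z_{n+1}\in C\mid\mathcal{F}_n\}=K(z_n,C)\ \ge\ \alpha\,K(z_n,B)=\alpha\,\mathbb{P}_x\{z_{n+1}\in B\mid\mathcal{F}_n\},$$
applied at the (random) times at which the chain is about to re-enter $B$.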
Taking $k\to\infty$ together with (I) gives $\mathbb{P}_x\{\tau_C=\infty\}=0$ for all $x\in X$, i.e. $C$ is recurrent.
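Spelled out: since $\tau_B(k)<\infty$ holds $\mathbb{P}_x$-a.s. for every $k$ by (I), the event $\{\tau_C=\infty\}$ is contained in $\{\tau_C>\tau_B(k)\}$ up to a null set, hence
$$\mathbb{P}_x\{\tau_C=\infty\}\ \le\ \mathbb{P}_x\{\tau_C>\tau_B(k)\}\ \le\ (1-\alpha)^k\ \xrightarrow[k\to\infty]{}\ 0.$$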