Harris chain

From Wikipedia, the free encyclopedia

A Harris chain, named after the mathematician Theodore E. Harris, is a special Markov chain in discrete time on a measurable state space. Harris chains are of interest because ergodic theorems can be formulated for them.

Definition

Let $(E, \mathcal{E})$ be a measurable space and let $(X_n)_{n \in \mathbb{N}_0}$ be a Markov chain on the state space $E$ with transition kernel $p$. The chain is called a Harris chain if there exist sets $A, B \in \mathcal{E}$, an $\varepsilon > 0$ and a probability measure $\rho$ on $\mathcal{E}$ with $\rho(B) = 1$ such that:

  1. $P_z(\tau_A < \infty) > 0$ for all $z \in E$, and
  2. $p(x, C) \geq \varepsilon \, \rho(C)$ for all $x \in A$ and all measurable $C \subseteq B$.

Here $\tau_A := \inf\{ n \geq 0 : X_n \in A \}$ denotes the first entry time of the chain into the set $A$.
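
A simple worked example (not part of the source text, but a standard consequence of the definition): suppose the state space $E$ is countable and there is a state $a \in E$ that the chain reaches with positive probability from every starting point. Such a chain is always a Harris chain, since one may choose

\[
A = \{a\}, \qquad B = E, \qquad \rho(\cdot) = p(a, \cdot), \qquad \varepsilon = 1 .
\]

With these choices $\rho$ is a probability measure with $\rho(B) = 1$, condition 2 reads $p(a, C) \geq 1 \cdot p(a, C)$ for every measurable $C \subseteq E$ and holds trivially, and condition 1 is exactly the assumed reachability of $a$.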

References

  1. Rick Durrett: Probability: Theory and Examples. 4th edition. Cambridge University Press, 2010, ISBN 978-0-521-76539-8, Section 6.8, pp. 318 ff. (limited preview in Google Book Search).