# Transience and Recurrence

The states of a Markov chain can be classified as either transient or recurrent. A state is transient if a chain starting there has a positive probability of never returning, and so visits the state only finitely many times; a state is recurrent if the chain is certain to return, and so visits the state infinitely many times.

Biologists can model a population's genotype distribution across generations with a Markov chain. Suppose there are two individuals with genotype Aa, and in each successive generation, two individuals are selected from the (numerous) offspring of the previous generation. These pairs of individuals form the possible states: AA and AA; AA and Aa; Aa and Aa; AA and aa; Aa and aa; aa and aa. Note that the first and last states listed are absorbing states. If the process reaches the state Aa and Aa, and two independent choices are made from the offspring population, determine the probability that the process reaches the state AA and AA in the very next step.

Recall that a child receives one allele (letter) from each parent, with a $$50\%$$ probability of either allele being selected; for instance, the parents with genotype Aa will have a child with genotype AA with probability $$25\%$$, Aa with probability $$50\%$$ and aa with probability $$25\%$$.
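The allele mechanics above can be checked by direct enumeration. The sketch below (the function name `offspring_distribution` is ours, not from the text) lists the four equally likely allele choices for two Aa parents and tallies the resulting genotypes:

```python
from itertools import product
from fractions import Fraction

def offspring_distribution(parent1, parent2):
    """Enumerate the equally likely allele choices, one from each parent."""
    dist = {}
    for a, b in product(parent1, parent2):
        genotype = "".join(sorted((a, b)))  # "Aa" and "aA" are the same genotype
        dist[genotype] = dist.get(genotype, Fraction(0)) + Fraction(1, 4)
    return dist

print(offspring_distribution("Aa", "Aa"))
# {'AA': Fraction(1, 4), 'Aa': Fraction(1, 2), 'aa': Fraction(1, 4)}
```

Since the two individuals of the next generation are drawn independently, the probability that a pair of offspring both have a given genotype is the square of the corresponding entry.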

The recurrent states of a Markov chain can be further subdivided into positive recurrent and null recurrent states. A state is positive recurrent if the expected time taken for a chain starting from that state to return to it is finite, and a state is null recurrent if that same expectation is infinite.

Which of the following are true?

(1) Suppose state $$i$$ is null recurrent. The expected number of returns to state $$i$$ is infinite.
(2) Suppose state $$i$$ is positive recurrent. The expected number of returns to state $$i$$ is finite.
(3) Suppose state $$i$$ is transient. The expected number of steps needed to return to state $$i$$ is not finite.
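To build intuition for statements like these, one can simulate a transient state. In a biased random walk on the integers (a standard example, not from the text), the walker drifts away from its start, so the number of returns to the origin is finite on average; theory gives an expected count of $$\tfrac{2q}{1-2q} = 1.5$$ returns when the walker steps right with probability $$p = 0.7$$ and left with probability $$q = 0.3.$$ A rough simulation sketch:

```python
import random

def returns_to_zero(p_right=0.7, max_steps=2000, rng=random):
    """Count visits back to 0 for a biased walk started at 0."""
    pos, returns = 0, 0
    for _ in range(max_steps):
        pos += 1 if rng.random() < p_right else -1
        if pos == 0:
            returns += 1
    return returns

random.seed(0)
trials = 2000
avg = sum(returns_to_zero() for _ in range(trials)) / trials
print(avg)  # close to the theoretical value 1.5
```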

A Markov chain has states that are the natural numbers $$1, 2, 3, \ldots$$ At state $$n > 1,$$ the chain moves to $$n+1$$ with probability $$\tfrac{1}{n},$$ moves to $$n-1$$ with probability $$\tfrac{1}{n},$$ and remains at $$n$$ with probability $$1 - \tfrac{2}{n}$$ (state $$1$$ simply has probability $$1$$ of going to state $$2$$).

Are any states periodic?
Are any states transient?
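The transition rule is easy to encode for experimentation; the sketch below (function names are ours) verifies that each state's probabilities form a valid distribution and runs a short trajectory:

```python
import random

def transition_probs(n):
    """Distribution over next states when the chain is at state n."""
    if n == 1:
        return {2: 1.0}
    return {n + 1: 1 / n, n - 1: 1 / n, n: 1 - 2 / n}

def step(n, rng=random):
    r, acc = rng.random(), 0.0
    for state, p in transition_probs(n).items():
        acc += p
        if r < acc:
            return state
    return n  # guard against floating-point round-off

# sanity check: every row of the transition matrix sums to 1
for n in range(1, 50):
    assert abs(sum(transition_probs(n).values()) - 1) < 1e-12

random.seed(1)
x = 1
for _ in range(10_000):
    x = step(x)
print(x)  # state after 10,000 steps
```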

A very basic question about Markov chains is the expected number of steps they will take to get from one state to another. These questions can be answered by looking at the fundamental matrix. Let $$Q$$ be the transition matrix with the rows and columns corresponding to absorbing states deleted; the fundamental matrix $$N = (I-Q)^{-1}$$ contains in row $$i$$ and column $$j$$ the expected number of visits to state $$j$$ before being absorbed, starting from state $$i.$$ The total number of steps expected before absorption, then, is the sum of the entries in row $$i,$$ and this calculation can be done for any target state by tweaking the chain to make that state absorbing.
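As an illustration, consider a symmetric gambler's-ruin chain on states $$0, 1, 2, 3$$ with $$0$$ and $$3$$ absorbing (our example, not from the text). Deleting the absorbing rows and columns leaves $$Q,$$ and the row sums of $$N = (I-Q)^{-1}$$ give the expected steps to absorption:

```python
import numpy as np

# Transient part Q of the transition matrix: from state 1 or 2,
# move up or down with probability 1/2 (states 0 and 3 absorb).
Q = np.array([[0.0, 0.5],
              [0.5, 0.0]])

N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix
steps = N.sum(axis=1)             # expected steps to absorption

print(N)      # [[4/3, 2/3], [2/3, 4/3]]
print(steps)  # [2. 2.]
```

The result matches the known gambler's-ruin formula: starting from state $$i,$$ the expected absorption time is $$i(3-i),$$ which is $$2$$ for both transient states.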

If a process starts at state A, what is the expected number of steps until it reaches state D?

The question of how long it takes to return to a state, starting from that state, is much easier: for an irreducible chain with stationary distribution $$\pi,$$ the expected return time to state $$i$$ is just the reciprocal $$\tfrac{1}{\pi_i}$$ of the corresponding element of the stationary distribution.
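For a concrete two-state chain (a hypothetical example of ours), the stationary distribution can be found as the left eigenvector of the transition matrix with eigenvalue $$1,$$ and the expected return times are its reciprocals:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

print(pi)      # approximately [5/6, 1/6]
print(1 / pi)  # expected return times: approximately [1.2, 6.0]
```

So although the chain spends most of its time in the first state, a visit to the second state recurs on average only every six steps.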

If a process starts at state A, what is the expected number of steps until it returns to state A?
