
Pre-requisites

This vignette builds on the Introduction to Discrete Markov chains vignette. It assumes an understanding of matrix multiplication, matrix powers, and eigendecomposition. It also uses, but does not explain, the notion of an ergodic Markov chain (we hope to add a vignette on this soon!).

Overview

The stationary distribution of a Markov chain is an important feature of the chain. One way to compute it is via an eigendecomposition of the transition matrix \(P\). The eigendecomposition is also useful because it suggests how we can quickly compute matrix powers like \(P^n\) and how we can assess the rate of convergence to the stationary distribution.

Stationary distribution of a Markov Chain

As part of the definition of a Markov chain, there is some probability distribution on the states at time \(0\). At each time step the distribution over states evolves according to the transition matrix \(P\): some states become more likely and others less likely. The stationary distribution of a Markov chain describes the distribution of \(X_t\) after a sufficiently long time that the distribution of \(X_t\) no longer changes. To put this notion in equation form, let \(\pi\) be a column vector of probabilities on the states that a Markov chain can visit. Then \(\pi\) is the stationary distribution if it has the property \[\pi^T= \pi^T P.\]
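Written out component by component, this matrix equation says that for every state \(j\) \[\pi_j = \sum_i \pi_i P_{ij},\] i.e. the total probability flowing into state \(j\) in one step of the chain equals the probability already sitting at \(j\). These are the global balance equations revisited at the end of this vignette.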

Not all Markov chains have a stationary distribution, but for some classes of probability transition matrices (those defining ergodic Markov chains), a stationary distribution is guaranteed to exist.

Example: Gary’s mood

In Sheldon Ross’s Introduction to Probability Models, Example 4.3 models Gary’s mood as a Markov chain. Gary alternates between 3 states: Cheery (\(X=1\)), So-So (\(X=2\)), and Glum (\(X=3\)). Here we input the \(P\) matrix given by Ross and an arbitrary initial probability vector.

# Define the probability transition matrix 
# (note: matrix() fills by column, so we transpose to turn the column-wise input into rows)
P=t(matrix(c(c(0.5,0.4,0.1),c(0.3,0.4,0.3),c(0.2,0.3,0.5)),nrow=3))
# Check that each row sums to 1
apply(P,1,sum)  
[1] 1 1 1
# Define the initial probability vector
x0=c(0.1,0.2,0.7)
# Check that it sums to 1
sum(x0)
[1] 1
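
To see this evolution in action, we can propagate \(x_0\) forward by repeatedly multiplying by \(P\) (a quick illustration; the vectors x1 and x_many below are defined just for this check). After enough steps the result stops changing and matches the stationary distribution computed below.

# Illustration: evolve the initial distribution forward in time
x1 <- x0 %*% P                            # distribution after one step
x_many <- x0
for (i in 1:50) x_many <- x_many %*% P    # distribution after 50 steps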

Solving for stationary distributions

The stationary distribution has the property \(\pi^T= \pi^T P\).

Brute-force solution

A brute-force hack for finding the stationary distribution is simply to raise the transition matrix to a high power and then extract any row. Note that in R the ^ operator applied to a matrix raises each element to the given power, so we compute the matrix power by repeated matrix multiplication with %*%.

# Matrix power P^100 by repeated multiplication (P^100 alone is element-wise in R)
P100 <- diag(3)
for (i in 1:100) P100 <- P100 %*% P
pi_bru <- P100[1,]
pi_bru
[1] 0.3387097 0.3709677 0.2903226

We can test whether the resulting vector is a stationary distribution by checking that it satisfies \(\pi^{T}=\pi^{T}P\) (i.e. \(\pi^{T}-\pi^{T}P = 0\)).

pi_bru - pi_bru%*%P

The differences are zero up to numerical rounding error (at the level of machine precision), so, for this example, our numerical solution checks out.

Solving via eigendecomposition

Note that the equation \(\pi^T P=\pi^T\) implies that the vector \(\pi\) is a left eigenvector of \(P\) with eigenvalue equal to 1 (recall that a left eigenvector is a row vector \(x\) satisfying \(xA=\lambda x\), as opposed to the more standard right eigenvector, which satisfies \(Ax=\lambda x\)). In what follows, we use the eigenvector functions in R to extract the solution.

library(MASS)
# Get the eigenvectors of P, note: R returns right eigenvectors
r=eigen(P)
rvec=r$vectors
# The left eigenvectors are the rows of the inverse of the right-eigenvector matrix
lvec=ginv(r$vectors)
# The eigenvalues
lam<-r$values
# Two ways of checking the spectral decomposition:
## Standard definition
rvec%*%diag(lam)%*%ginv(rvec)
     [,1] [,2] [,3]
[1,]  0.5  0.4  0.1
[2,]  0.3  0.4  0.3
[3,]  0.2  0.3  0.5
## With left eigenvectors (trivial change)
rvec%*%diag(lam)%*%lvec
     [,1] [,2] [,3]
[1,]  0.5  0.4  0.1
[2,]  0.3  0.4  0.3
[3,]  0.2  0.3  0.5
lam 
[1] 1.00000000 0.34142136 0.05857864

We see the first eigenvalue is \(1\) and so the first left eigenvector, suitably normalized, should contain the stationary distribution:

pi_eig<-lvec[1,]/sum(lvec[1,])
pi_eig
[1] 0.3387097 0.3709677 0.2903226
sum(pi_eig)
[1] 1
pi_eig %*% P
          [,1]      [,2]      [,3]
[1,] 0.3387097 0.3709677 0.2903226

And we see the procedure checks out.

As a side-note: we can also obtain the left eigenvectors of \(P\) as the transposes of the right eigenvectors of t(P):

r<-eigen(t(P))
V<-r$vectors
lam<-r$values
V%*%diag(lam)%*%ginv(V)
     [,1] [,2] [,3]
[1,]  0.5  0.3  0.2
[2,]  0.4  0.4  0.3
[3,]  0.1  0.3  0.5
# Note how we are pulling columns here. 
pi_eig2 <- V[,1]/sum(V[,1])
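
As a quick sanity check, pi_eig2 should agree (up to numerical error) with the stationary distribution found above:

# pi_eig2 and pi_eig should match up to numerical error
all.equal(pi_eig, pi_eig2)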

Rate of approach to the stationary distribution

The size of the first non-unit eigenvalue (\(\lambda_2\)) indicates the rate of approach to equilibrium because it describes how quickly the largest of the vanishing terms (i.e. those with \(|\lambda_i|<1\)) will approach zero.

This is most easily seen by recalling that the eigendecomposition of \(P^n\) can be written as \[P^n = \sum_i \lambda_i^n r_i l_i^T,\] where \(r_i\), \(l_i\), and \(\lambda_i\) are the right eigenvectors, left eigenvectors, and eigenvalues of the matrix \(P\), respectively. So, once \(\lambda_2^n\) approaches 0, the only term left in the eigendecomposition is the one corresponding to the first eigenvalue - i.e. the stationary distribution! As a rough rule of thumb, taking a number \(x\) less than 1 to the \(n\)'th power will approach 0 if \(n\) is larger than some small multiple of \(1/x\) time-steps (e.g. if \(n > 4/x\)).

For our example, \(1/\lambda_2\) is approximately 3 time steps:

1/lam[2]
[1] 2.928932
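
As a quick illustration of the rule of thumb, we can watch \(\lambda_2^n\) shrink as \(n\) grows past a few multiples of \(1/\lambda_2\):

# lambda_2^n for n equal to roughly 1, 2, 4, and 8 times 1/lambda_2
lam[2]^c(3, 6, 12, 24)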

This implies we will reach equilibrium fairly quickly - much more quickly than the 100 steps we used for our brute-force solution to the stationary distribution. As a test, let's see how \(P^{12}\) (i.e. approximately \(4/\lambda_2\) steps) looks. Since ^ would again act element-wise, we compute the matrix power from the eigendecomposition:

# Matrix power P^12 computed via the eigendecomposition
rvec%*%diag(lam^12)%*%lvec

Indeed, every row is already very close to the stationary distribution (the remaining terms are scaled by \(\lambda_2^{12}\approx 2.5\times 10^{-6}\)) - Gary’s mood will return to its stationary distribution relatively quickly after any perturbation!

A side-note: Computational advantage of using an eigendecomposition for matrix powers

Thanks to the eigendecomposition, obtaining the matrix power \(P^n\) only requires taking powers of the eigenvalues. Compare the following lines of code, which reproduce \(P\), \(P^2\), and \(P^{100}\), to the results above. And note - this is much faster than naively doing the matrix multiplication over and over to obtain the powers.

rvec%*%diag(lam)%*%lvec
     [,1] [,2] [,3]
[1,]  0.5  0.4  0.1
[2,]  0.3  0.4  0.3
[3,]  0.2  0.3  0.5
rvec%*%diag(lam^2)%*%lvec
     [,1] [,2] [,3]
[1,] 0.39 0.39 0.22
[2,] 0.33 0.37 0.30
[3,] 0.29 0.35 0.36
rvec%*%diag(lam^100)%*%lvec
          [,1]      [,2]      [,3]
[1,] 0.3387097 0.3709677 0.2903226
[2,] 0.3387097 0.3709677 0.2903226
[3,] 0.3387097 0.3709677 0.2903226
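
For illustration, here is a small sketch (the helper functions are defined just for this comparison) contrasting the naive approach, which needs \(n-1\) matrix multiplications, with the eigendecomposition-based power, which only raises the eigenvalues to the \(n\)'th power; the two results should agree up to numerical error.

# Naive matrix power: multiply P by itself n-1 times
mat_pow_naive <- function(P, n) Reduce(`%*%`, replicate(n, P, simplify = FALSE))
# Eigendecomposition-based power: only the eigenvalues are raised to the n-th power
mat_pow_eigen <- function(rvec, lvec, lam, n) rvec %*% diag(lam^n) %*% lvec
# Should return TRUE (up to numerical tolerance)
all.equal(mat_pow_naive(P, 100), mat_pow_eigen(rvec, lvec, lam, 100))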

Miscellaneous: Solving a system of linear equations

Another approach is to solve the system of linear equations \(\pi^{T}=\pi^{T}P\) together with the normalization constraint \(\sum_i \pi_i = 1\). These equations are known as the global balance equations, and this approach is introduced in Discrete Markov Chains: Finding the Stationary Distribution via solution of the global balance equations. We include it here for comparison to the eigendecomposition approach on the same example.

K<-3
# Basic system: t(I - P) %*% pi = 0, i.e. pi^T = pi^T P
A_basic <- t(diag(rep(1,K))-P)
b_basic <- rep(0,K)

# Now add the constraint that the probabilities sum to 1
A_constr <- rbind(A_basic,rep(1,K))
b_constr <- c(b_basic,1)

# Solve the overdetermined system in the least-squares sense via the normal equations
pi_lineq <- t(solve(t(A_constr)%*%A_constr,t(A_constr)%*%b_constr))
pi_lineq%*%P
          [,1]      [,2]      [,3]
[1,] 0.3387097 0.3709677 0.2903226

And the solution checks out!
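
As a final cross-check, the three estimates of the stationary distribution obtained above should agree up to numerical error:

# Compare the brute-force, eigendecomposition, and linear-equation solutions
all.equal(pi_eig, pi_bru)
all.equal(pi_eig, as.numeric(pi_lineq))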



sessionInfo()
R version 3.5.2 (2018-12-20)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Mojave 10.14.1

Matrix products: default
BLAS: /Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRblas.0.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRlapack.dylib

locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] MASS_7.3-51.1

loaded via a namespace (and not attached):
 [1] workflowr_1.2.0 Rcpp_1.0.0      digest_0.6.18   rprojroot_1.3-2
 [5] backports_1.1.3 git2r_0.24.0    magrittr_1.5    evaluate_0.12  
 [9] stringi_1.2.4   fs_1.2.6        whisker_0.3-2   rmarkdown_1.11 
[13] tools_3.5.2     stringr_1.3.1   glue_1.3.0      xfun_0.4       
[17] yaml_2.2.0      compiler_3.5.2  htmltools_0.3.6 knitr_1.21     
