Markov Chain
A Markov chain is a collection of random variables $\{X_t\}$ (where the index $t$ runs through $0, 1, \ldots$) having the property that, given the present, the future is conditionally independent of the past. In other words,

$$P(X_t = j \mid X_0 = i_0,\, X_1 = i_1,\, \ldots,\, X_{t-1} = i_{t-1}) = P(X_t = j \mid X_{t-1} = i_{t-1}).$$

If a Markov sequence of random variates $X_n$ takes the discrete values $a_1, \ldots, a_N$, then

$$P(X_n = a_{i_n} \mid X_{n-1} = a_{i_{n-1}}, \ldots, X_1 = a_{i_1}) = P(X_n = a_{i_n} \mid X_{n-1} = a_{i_{n-1}}),$$

and the sequence $X_n$ is called a Markov chain. A simple random walk is an example of a Markov chain: the next position depends only on the current position, not on the path taken to reach it.
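To make the one-step dependence concrete, here is a minimal Python sketch that simulates a finite-state Markov chain. The two-state "sunny/rainy" example and its transition matrix are made-up illustrations, not part of the original definition; the essential point is that each new state is drawn using only the current state.

```python
import random

# Hypothetical two-state chain (illustrative values only).
# P[i][j] plays the role of P(X_t = j | X_{t-1} = i); each row sums to 1.
states = ["sunny", "rainy"]
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate(n_steps, start=0, seed=0):
    """Simulate the chain for n_steps; each step depends only on the current state."""
    rng = random.Random(seed)
    i = start
    path = [i]
    for _ in range(n_steps):
        # Sample the next state from row i of the transition matrix;
        # the history before the current state is never consulted.
        i = rng.choices(range(len(states)), weights=P[i])[0]
        path.append(i)
    return [states[j] for j in path]

print(simulate(10))
```

Replacing the states with integer positions and the transitions with equally likely $\pm 1$ steps from the current position recovers the simple random walk mentioned above.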