Markov chain

From Conservapedia


A Markov chain, sometimes called a Markov process, is a sequence of random states in which the probability of each state depends only on the immediately preceding state and is independent of all earlier states. An example of a Markov process is the random walk.
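The random walk can be sketched as a short simulation. In this minimal Python example (the function name `random_walk` is illustrative, not from the article), the next position is drawn using only the current position, which is exactly the Markov property:

```python
import random

def random_walk(steps, seed=None):
    """Simulate a simple symmetric random walk on the integers.

    This is a Markov chain: each new position is computed from the
    current position alone, by adding a +1 or -1 step chosen at random.
    """
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        # The next state depends only on the current state, not on the history.
        position += rng.choice([-1, 1])
        path.append(position)
    return path

walk = random_walk(10, seed=42)
print(walk)
```

Note that no earlier positions are consulted inside the loop; the entire history in `path` is recorded only for output.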
