
A Markov chain (discrete-time Markov chain or DTMC), named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another on a state space. It is a random process usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specifi...
Found on
http://en.wikipedia.org/wiki/Markov_chain

(from the article `Markov, Andrey Andreyevich`) Russian mathematician who helped to develop the theory of stochastic processes, especially those called Markov chains. Based on the study of the ...
Found on
http://www.britannica.com/eb/a-z/m/38

A sequence of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors. In other words, a Markov chain describes a chance process in which the future state can be predicted from its presen...
Found on
http://www.daviddarling.info/encyclopedia/M/Markov_chain.html

A stochastic process is a Markov chain if: (1) time is discrete, meaning that the time index t has a finite or countably infinite number of values; (2) the set of possible values of the process at each time is finite or countably infinite; and (3) it has the Markov property of memorylessness. Source: Hoel, Port, and Stone, 1987, pg v and pg 1 Conte...
Found on
http://www.econterms.com/glossary.cgi?query=Markov+chain

A stochastic model having discrete states in which the probability of being in any state at any time depends only on the state at the previous time and on the probability transition matrix.
Found on
http://www.encyclo.co.uk/visitor-contributions.php
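The transition-matrix definition above can be sketched in code. The following is a minimal, hypothetical Python example (the state names and probabilities are invented for illustration, not taken from any of the quoted sources): the next state is drawn using only the row of the transition matrix for the current state, which is exactly the memorylessness the definitions describe.

```python
import random

# Illustrative two-state chain; states and probabilities are assumptions.
STATES = ["sunny", "rainy"]

# P[i][j] = probability of moving from state i to state j.
# Each row sums to 1, as required of a probability transition matrix.
P = [
    [0.9, 0.1],  # transitions from "sunny"
    [0.5, 0.5],  # transitions from "rainy"
]

def step(state_index, rng):
    """Pick the next state using only the current state (Markov property)."""
    return rng.choices(range(len(STATES)), weights=P[state_index])[0]

def simulate(start_index, n_steps, seed=0):
    """Run the chain for n_steps and return the visited state names."""
    rng = random.Random(seed)
    path = [start_index]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return [STATES[i] for i in path]

print(simulate(0, 5))
```

Note that `step` never looks at earlier states in `path`, only the last one; that restriction is what makes this a Markov chain rather than a general stochastic process.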

[n] - a Markov process for which the parameter is discrete time values
Found on
http://www.webdictionary.co.uk/definition.php?query=Markov%20chain
Markoff chain (noun): a Markov process for which the parameter is discrete time values
Found on
https://www.encyclo.co.uk/local/20974