Markov chain mixing time definition

In probability theory, the mixing time of a Markov chain is the time until the chain is "close" to its steady-state (stationary) distribution. "Close" is typically measured by the total variation distance between the distribution of the chain after a given number of steps and the stationary distribution.
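The definition above can be made concrete numerically. A common formalization (assumed here, not spelled out in this entry) takes the mixing time at tolerance eps to be the smallest t such that the total variation distance between P^t(x, ·) and the stationary distribution pi is at most eps for every start state x. A minimal sketch in Python with NumPy, using a hypothetical two-state chain:

```python
import numpy as np

# Hypothetical two-state chain (not from the source) with stationary
# distribution pi = [0.5, 0.5]; it stays put with probability 0.75.
P = np.array([[0.75, 0.25],
              [0.25, 0.75]])
pi = np.array([0.5, 0.5])

def tv_distance(p, q):
    """Total variation distance between two probability vectors."""
    return 0.5 * np.abs(p - q).sum()

def mixing_time(P, pi, eps=0.25):
    """Smallest t such that every row of P^t is within eps of pi in TV distance."""
    n = P.shape[0]
    Pt = np.eye(n)  # P^0: the chain has not moved yet
    t = 0
    while max(tv_distance(Pt[x], pi) for x in range(n)) > eps:
        Pt = Pt @ P  # advance one step: P^t -> P^(t+1)
        t += 1
    return t

print(mixing_time(P, pi, eps=0.25))  # -> 1
print(mixing_time(P, pi, eps=0.01))  # -> 6
```

For this chain the worst-case TV distance after t steps is (1/2)^(t+1), so halving eps adds roughly one step; faster-mixing chains shrink this distance more quickly per step.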
Found on http://en.wikipedia.org/wiki/Markov_chain_mixing_time