markov-chain

  • 11 Markov chain — /mahr kawf/, Statistics. a Markov process restricted to discrete random events or to discontinuous time sequences. Also, Markoff chain. [1940–45; see MARKOV PROCESS] …

    Universalium

  • 12 Markov chain — noun a Markov process for which the parameter is discrete time values • Syn: ↑Markoff chain • Hypernyms: ↑Markov process, ↑Markoff process …

    Useful english dictionary

  • 13 Markov chain — noun Etymology: A. A. Markov, died 1922, Russian mathematician. Date: 1938. a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or… …

    New Collegiate Dictionary

  • 14 Markov chain — noun A discrete-time stochastic process with the Markov property …

    Wiktionary
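The definitions above share one idea: the probability of the next state depends only on the current state, never on the earlier history. A minimal Python sketch of a discrete-time chain; the two-state "weather" model and its transition probabilities are made up for illustration.

```python
import random

# Hypothetical two-state chain: each row lists (next_state, probability)
# and depends only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Walk the chain for n steps from a given start state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

path = simulate("sunny", 5)
```

Note that `simulate` never consults anything but the latest state, which is exactly the "discrete time values" restriction the entries above describe.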

  • 15 Quantum Markov chain — In mathematics, the quantum Markov chain is a reformulation of the ideas of a classical Markov chain, replacing the classical definitions of probability with quantum probability. Very roughly, the theory of a quantum Markov chain resembles that… …

    Wikipedia

  • 16 Lempel-Ziv-Markov chain algorithm — The Lempel–Ziv–Markov chain algorithm (LZMA) is an algorithm used to perform data compression. It has been under development since 1998 [The SDK history file states that it was in development from 1996, and first used in 7-Zip 2001-08-30. Aside… …

    Wikipedia
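Python's standard-library `lzma` module exposes this algorithm (by default wrapping LZMA2 in an `.xz` container), so a round-trip needs no third-party code:

```python
import lzma

# Highly repetitive input compresses well under LZMA.
data = b"Markov chain " * 100

compressed = lzma.compress(data)      # .xz container, LZMA2 filter by default
restored = lzma.decompress(compressed)

assert restored == data
assert len(compressed) < len(data)
```

For raw streams or legacy `.lzma` files, the module also accepts a `format=` argument (`lzma.FORMAT_ALONE`, `lzma.FORMAT_RAW`) on both calls.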

  • 17 Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… …

    Wikipedia
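The "partly random, partly controlled" split in the MDP entry can be made concrete with value iteration, the standard dynamic-programming solver. A minimal sketch on an invented two-state, two-action MDP; all states, actions, probabilities, and rewards below are illustrative, not from the source.

```python
# P[s][a] -> list of (next_state, probability, reward); hypothetical numbers.
P = {
    0: {"stay": [(0, 1.0, 1.0)],
        "go":   [(1, 0.9, 0.0), (0, 0.1, 1.0)]},
    1: {"stay": [(1, 1.0, 2.0)],
        "go":   [(0, 0.9, 0.0), (1, 0.1, 2.0)]},
}
GAMMA = 0.9  # discount factor for future rewards

def value_iteration(P, gamma, tol=1e-8):
    """Iterate the Bellman optimality update until values stop changing."""
    V = {s: 0.0 for s in P}
    while True:
        delta = 0.0
        for s in P:
            # Q-value of each action: expected reward plus discounted value.
            q = {a: sum(p * (r + gamma * V[s2]) for s2, p, r in outcomes)
                 for a, outcomes in P[s].items()}
            best = max(q.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    # Greedy policy with respect to the converged values.
    policy = {s: max(P[s], key=lambda a: sum(
        p * (r + gamma * V[s2]) for s2, p, r in P[s][a])) for s in P}
    return V, policy

V, policy = value_iteration(P, GAMMA)
```

Here the "decision maker" picks the action, while the transition probabilities supply the random part; the greedy policy resolves that trade-off (in this toy instance, moving to the higher-reward state and staying there).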

  • 18 Markov — Markov, Markova, or Markoff are surnames and may refer to: In academia: Ivana Markova (born 1938), Czechoslovak-British emeritus professor of psychology at the University of Stirling John Markoff (sociologist) (born 1942), American professor of… …

    Wikipedia

  • 19 Markov process — Mark·ov pro·cess, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next… …

    The Collaborative International Dictionary of English

  • 20 Markov tree — may refer to: a tree whose vertices correspond to Markov numbers, or a Markov chain. This disambiguation page lists mathematics articles associated with the same title. If an internal link led you here, you may wi …

    Wikipedia