Markov chain

noun

Definition of MARKOV CHAIN

:  a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved — called also Markoff chain
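
As a rough illustration of this property, the following minimal Python sketch simulates the random walk mentioned in the definition; the function name and parameters are illustrative, not taken from any particular source. Each step of the walk uses only the current position, never the path that led to it.

import random

def random_walk(steps, start=0, p_up=0.5):
    """Simulate a simple random walk: a discrete Markov chain whose next
    state depends only on the current state, not on how it was reached."""
    state = start
    path = [state]
    for _ in range(steps):
        # The transition consults only the current state (the Markov property).
        state += 1 if random.random() < p_up else -1
        path.append(state)
    return path

print(random_walk(10))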

Origin of MARKOV CHAIN

A. A. Markov †1922 Russian mathematician
First Known Use: 1938
