Markov chain

noun

Definition of Markov chain

: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
called also Markoff chain
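Expressed as a formula (one common formalization of the defining "memoryless" property), a process with states X_0, X_1, X_2, … is a Markov chain when

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]

that is, the conditional distribution of the next state depends only on the current state. A simple random walk, in which each step moves one unit up or down with fixed probabilities regardless of how the walker reached its current position, is a standard instance.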

Origin and Etymology of Markov chain

A. A. Markov †1922, Russian mathematician

