Markov chain

noun

Definition of Markov chain

: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved

called also Markoff chain
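
The defining property (the probability distribution of the next state depends only on the current state, not on the path taken to reach it) can be illustrated with a short simulation. Below is a minimal Python sketch, not part of the dictionary entry; the two-state weather model, its state names, and its transition probabilities are invented purely for illustration.

import random

# Transition probabilities: transitions[state][next_state].
# Each row depends only on the current state -- the Markov property.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state's probabilities."""
    states = list(transitions[state])
    weights = list(transitions[state].values())
    return random.choices(states, weights=weights)[0]

def walk(start, n):
    """Generate a chain of n steps starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(walk("sunny", 10))

Running the sketch prints one realization of the chain; note that each draw in step() consults only the current state's row of the transition table, never the earlier history.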

First Known Use of Markov chain

1938, in the meaning defined above

History and Etymology for Markov chain

A. A. Markov †1922 Russian mathematician
