Markov chain

noun

Definition of Markov chain

: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved

— called also Markoff chain
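
As an illustrative sketch of the idea in the definition, the following Python snippet simulates a two-state chain with a made-up transition matrix (the states and probabilities are assumptions, not part of the entry); each next state is drawn from probabilities that depend only on the current state, never on the path by which it was reached.

```python
import random

# Assumed two-state chain ("sunny"/"rainy") with a made-up transition matrix.
# The probability of the next state depends only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state's probabilities."""
    roll, cumulative = random.random(), 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if roll < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def walk(start, n):
    """Generate a chain of n states starting from `start`."""
    state = start
    chain = [state]
    for _ in range(n):
        state = step(state)
        chain.append(state)
    return chain

print(walk("sunny", 10))
```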

First Known Use of Markov chain

1938, in the meaning defined above

History and Etymology for Markov chain

A. A. Markov †1922 Russian mathematician


