Markov process

noun

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous
also : Markov chain

called also Markoff process
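
The definition above describes a process whose next state depends only on its current state, with the states drawn from a continuum (as in Brownian motion) rather than a finite set. A minimal sketch of such a process is a Gaussian random walk, i.e. discretized Brownian motion; this example is illustrative only and not part of the dictionary entry, and the function name and parameters are invented for the sketch:

```python
import random

def simulate_markov_process(steps=5, x0=0.0, sigma=1.0, seed=42):
    """Simulate a simple continuous-state Markov process: a Gaussian
    random walk (discretized Brownian motion). The next state depends
    only on the current state -- the Markov property."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(steps):
        # Next state = current state + Gaussian increment;
        # earlier history plays no role.
        path.append(path[-1] + rng.gauss(0.0, sigma))
    return path

print(simulate_markov_process())
```

Replacing the continuous Gaussian increment with a jump among finitely many states would give a Markov chain, the discrete-state case the definition contrasts with.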

Examples of Markov process in a Sentence

Recent Examples on the Web

The layers mirrored what often happens in a mathematical system known as a Markov process, says Hovden. — Leila Sloman, Popular Mechanics, 28 Feb. 2022

This pattern is typical of systems involving a Markov process. — Leila Sloman, Popular Mechanics, 28 Feb. 2022


Word History

First Known Use

1938, in the meaning defined above


Cite this Entry

“Markov process.” Merriam-Webster.com Dictionary, Merriam-Webster, https://www.merriam-webster.com/dictionary/Markov%20process. Accessed 14 Apr. 2024.
