Markov process

noun

Definition of Markov process

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain

— called also Markoff process
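The definition contrasts a Markov process (continuous states) with the discrete-state special case, the Markov chain, where the next state depends only on the current state. A minimal sketch of that discrete case, with an illustrative two-state weather model (the states and probabilities are invented for the example, not taken from the entry):

```python
import random

# Transition table: probability of the next state given the current one.
# These states and weights are illustrative only.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, seed=0):
    """Simulate a Markov chain: each step depends only on the current state."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n_steps):
        states = list(TRANSITIONS[state])
        weights = list(TRANSITIONS[state].values())
        state = rng.choices(states, weights=weights, k=1)[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

A full Markov process such as Brownian motion generalizes this idea by letting the state take values in a continuum rather than a finite set.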

First Known Use of Markov process

1938, in the meaning defined above


Cite this Entry

“Markov process.” Merriam-Webster.com Dictionary, Merriam-Webster, https://www.merriam-webster.com/dictionary/Markov%20process. Accessed 14 Apr. 2021.

