Definition of Markov process
: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous
also
: Markov chain
— called also Markoff process
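The definition above can be illustrated with a small sketch: standard Brownian motion is a Markov process whose states range over the continuous real line, and each new state depends only on the current one. This is a minimal simulation, not tied to any particular library's API; the function name and parameters are chosen for illustration.

```python
import random

def brownian_path(steps, dt=0.01, seed=0):
    """Simulate standard Brownian motion, a Markov process with a
    continuous state space (the real line).  Each increment depends
    only on the current state, never on the earlier history."""
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(steps):
        # Markov update: next state = current state plus an
        # independent Gaussian increment with variance dt.
        x = x + rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(1000)
print(len(path))   # 1001 states, starting from 0.0
```

By contrast, a Markov chain in the narrow sense moves among a finite or countable set of states; here the state can take any real value, which is the distinction the definition draws.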
Time Traveler for Markov process
The first known use of Markov process was in 1938.