Markov process

noun

Definition of MARKOV PROCESS

:  a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also :  Markov chain —called also Markoff process
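The continuous-state behavior the definition mentions can be illustrated with a small simulation; the sketch below (a hypothetical helper, not part of the dictionary entry) discretizes Brownian motion, where each step depends only on the current state and not on the path taken to reach it — the defining Markov property.

```python
import random

def simulate_brownian(steps, dt=1.0, seed=0):
    # Discretized Brownian motion: a Markov process with continuous states.
    # Each increment is drawn using only the current position, so the
    # process is "memoryless" in the Markov sense.
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(steps):
        x += rng.gauss(0.0, dt ** 0.5)  # next state depends only on x
        path.append(x)
    return path

path = simulate_brownian(1000)
```

By contrast, a Markov chain restricts the states to a discrete set; the structure of the simulation (next state sampled from the current state alone) is the same in both cases.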
First Known Use of MARKOV PROCESS

1938
