
Markov process

noun

Definition of MARKOV PROCESS

: a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain — called also Markoff process
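As a brief illustration of the defining Markov property — the next state depends only on the current state, not on earlier history — here is a minimal discrete-state sketch in Python. The two-state weather chain and its transition probabilities are invented for illustration and are not part of the dictionary entry.

```python
import random

# Hypothetical two-state chain; the probabilities are illustrative only.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state from the current state alone (the Markov
    property): no earlier history is consulted."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    random.seed(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

A Markov *process* in the sense of the definition above generalizes this to continuous state spaces (as in Brownian motion), where the discrete transition table is replaced by a transition kernel.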

First Known Use of MARKOV PROCESS

1938
