Markov process


Definition of MARKOV PROCESS

:  a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also :  Markov chain. Also called Markoff process.
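The defining property behind both the continuous-state process and the discrete Markov chain mentioned above is that the next state depends only on the current state, not on the history. A minimal Python sketch of the discrete (Markov chain) case, using a hypothetical two-state weather model with made-up transition probabilities:

```python
import random

# Hypothetical transition probabilities for a two-state Markov chain.
# Each entry maps a state to (next_state, probability) pairs.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Draw the next state using only the current state (the Markov property)."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs)[0]

def simulate(start, n_steps, seed=0):
    """Simulate a path of n_steps transitions from the given start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path
```

Note that `step` receives only the current state: that locality is what makes the process Markovian, in contrast to processes whose dynamics depend on the full past trajectory.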

First Known Use of MARKOV PROCESS


