Markov process

noun

Definition of MARKOV PROCESS

: a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain —called also Markoff process
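The defining property shared by Markov processes and Markov chains is memorylessness: the next state depends only on the current state, not on the path taken to reach it. A minimal sketch of a discrete-state Markov chain, with illustrative states and transition probabilities that are assumptions, not part of this entry:

```python
import random

# Illustrative two-state weather chain (states and probabilities assumed).
# Each row gives the probabilities of moving to each next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Choose the next state using only the current state (memorylessness)."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()), k=1)[0]

def simulate(start, n, seed=0):
    """Run the chain for n steps from a starting state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

A Markov process in the continuous-state sense of the definition (e.g. Brownian motion) replaces the finite state dictionary above with a continuous state space, but the memoryless update rule is the same.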

First Known Use of MARKOV PROCESS

1938
