Markov process

noun

Definition of Markov process

  1. : a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain — called also Markoff process
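In symbols (a minimal sketch added for illustration, not part of the dictionary entry): writing X_t for the state at time t, the "memoryless" property that defines both Markov chains and Markov processes can be stated as

```latex
% Markov property, discrete-time form (illustrative notation assumed here):
% the conditional distribution of the next state depends only on the
% present state, not on the earlier history of the process.
P(X_{t+1} \in A \mid X_t, X_{t-1}, \ldots, X_0) = P(X_{t+1} \in A \mid X_t)
```

For a continuous-state example such as Brownian motion, A ranges over sets of real numbers rather than over the discrete states of a chain.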

First Known Use of Markov process

1938

