Markov process


Definition of Markov process

  1. : a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain (called also Markoff process)
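As an illustration of the definition (not part of the dictionary entry), the sketch below simulates a discretized Brownian motion, the example the definition cites: the defining Markov property is that each new state is generated from the current state alone, with no memory of earlier states. Function and parameter names here are hypothetical.

```python
import random

def brownian_path(steps, dt=0.01, seed=0):
    """Simulate a discretized Brownian motion path.

    Each increment is drawn afresh from a Gaussian with variance dt,
    so the next state depends only on the current state x -- the
    Markov property with a continuous state space.
    """
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(steps):
        # The update uses only x, never the earlier history.
        x += rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(1000)
print(len(path))  # 1001 states: the start plus one per step
```

A Markov chain differs only in that the state space is discrete (a finite or countable set of states) rather than continuous.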


