Definition of Markov process
: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain —called also Markoff process
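The defining property above — that the next state depends only on the current state, not on the history — can be sketched with a small discrete-state simulation. The two weather states and their transition probabilities below are purely illustrative assumptions, not part of the definition:

```python
import random

# Illustrative two-state Markov chain. Each state maps to a list of
# (next_state, probability) pairs; probabilities for a state sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state using only the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n):
    """Return a length-n trajectory of the chain starting from `start`."""
    states = [start]
    for _ in range(n - 1):
        states.append(step(states[-1]))
    return states

print(simulate("sunny", 5))
```

A Markov *process* in the sense of the definition generalizes this by letting the state space be continuous (as in Brownian motion) rather than a finite set of labels.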