Markov process

noun

Definition of Markov process

  1 : a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain — called also Markoff process
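To illustrate the distinction the definition draws, here is a minimal Python sketch (not part of the dictionary entry) that simulates standard Brownian motion, the continuous-state example cited above. The step count and time horizon are arbitrary choices for the example; the point is the Markov property: each new value is drawn using only the current value, never the earlier history of the path.

```python
import random

def simulate_brownian_motion(n_steps=1000, total_time=1.0, x0=0.0):
    """Simulate one path of standard Brownian motion.

    Markov property: the next value depends only on the current
    value, not on how the path arrived there.
    """
    dt = total_time / n_steps
    path = [x0]
    for _ in range(n_steps):
        # Increment ~ Normal(0, dt); uses only the current state path[-1].
        path.append(path[-1] + random.gauss(0.0, dt ** 0.5))
    return path

path = simulate_brownian_motion()
print(f"Final value after 1 unit of time: {path[-1]:.4f}")
```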

First Known Use of Markov process

1938

