
Markov process

noun

Definition of Markov process

  1. : a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain —called also Markoff process
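The defining property above — that the future depends only on the present state, not on the path taken to reach it — can be illustrated with a minimal sketch of a discrete-state Markov chain. The two states and their transition probabilities here are illustrative assumptions, not part of the dictionary entry.

```python
import random

# Illustrative two-state transition table (assumed values, not from the entry).
# Each row gives the probabilities of the next state given only the current one
# — the Markov property the definition describes.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

A Markov *process* in the sense of the definition generalizes this picture to a continuous state space (as in Brownian motion), but the memoryless transition rule is the same.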




First Known Use of Markov process

1938

