Markov process

noun

Definition of Markov process 

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain

— called also Markoff process
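The definition distinguishes two cases: a Markov chain moves among a finite set of states, while a Markov process such as Brownian motion takes continuous values; in both, the next state depends only on the current one. A minimal sketch of each (the weather states, transition table, and step counts below are illustrative assumptions, not part of the entry):

```python
import random

def simulate_markov_chain(transition, state, steps, seed=0):
    """Simulate a finite-state Markov chain: the next state is drawn
    from a distribution that depends only on the current state."""
    rng = random.Random(seed)
    path = [state]
    for _ in range(steps):
        states = list(transition[state])
        weights = [transition[state][s] for s in states]
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

def simulate_brownian(steps, dt=0.01, seed=0):
    """Discretized Brownian motion: a Markov process whose states are
    continuous; each increment is an independent Gaussian step."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(steps):
        x += rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

# Illustrative two-state transition table (rows sum to 1).
weather = {"sunny": {"sunny": 0.8, "rainy": 0.2},
           "rainy": {"sunny": 0.4, "rainy": 0.6}}

chain = simulate_markov_chain(weather, "sunny", 10)
walk = simulate_brownian(100)
```

The chain hops between discrete labels; the walk traces a real-valued path, which is the "states are continuous" case the definition describes.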

First Known Use of Markov process

1938, in the meaning defined above
