Markov process

noun

Definition of Markov process

: a stochastic process (such as Brownian motion) that resembles a Markov chain except that the states are continuous; also : Markov chain

— called also Markoff process
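The defining ("Markov") property — that the next state depends only on the current state, not on the path taken to reach it — can be illustrated with a minimal simulation. The sketch below discretizes Brownian motion, the example named in the definition; the function name, step size, and seed are illustrative choices, not part of the definition.

```python
import random

def brownian_path(steps, dt=0.01, seed=0):
    """Simulate a discretized Brownian motion.

    Each increment is drawn using only the current state x, never the
    earlier history of the path -- this is the Markov property.
    """
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(steps):
        # Gaussian increment with mean 0 and variance dt; note that the
        # update reads only the current x, not any previous values.
        x += rng.gauss(0.0, dt ** 0.5)
        path.append(x)
    return path

path = brownian_path(1000)
```

Because the states here are real numbers rather than a finite set, this is a continuous-state Markov process rather than a Markov chain in the narrow sense.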

First Known Use of Markov process

1938, in the meaning defined above
