Markov chain

noun

Definition of Markov chain

  1. :  a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved —called also Markoff chain
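
The definition describes the "memoryless" property: the probability of each possible next state is fixed by the present state alone, not by the earlier history. As a rough illustration, and not part of the dictionary entry, here is a minimal Python sketch of a two-state Markov chain; the states "sunny"/"rainy" and their transition probabilities are invented for the example.

import random

# Hypothetical transition table for illustration only: each row gives
# the probabilities of moving to each next state from the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state's row."""
    probs = TRANSITIONS[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)  # depends only on the present state
    path.append(state)
print(" -> ".join(path))

Note that step never looks at the path taken so far, matching the clause "depend only on the present state of the system ... and not on the path by which the present state was achieved."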



Origin of Markov chain

A. A. Markov †1922 Russian mathematician


First Known Use: 1938

