Definition of Markov chain
: a usually discrete stochastic process (such as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved
— called also Markoff chain
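To make the defining property concrete, here is a minimal Python sketch of the random-walk example the definition mentions. Everything here (function names, the step rule, the parameters) is illustrative, not part of the dictionary entry: the point is only that each next state is computed from the current state alone, never from the path that led to it.

```python
import random

def random_walk_step(state):
    # The Markov property: the next state depends only on the
    # current state -- move up or down by 1 with equal probability.
    return state + random.choice([-1, 1])

def simulate_chain(start=0, steps=10, seed=0):
    # Generate one sample path of the chain. Note that the history
    # list is recorded only for display; it is never consulted when
    # choosing the next state.
    random.seed(seed)
    state = start
    path = [state]
    for _ in range(steps):
        state = random_walk_step(state)
        path.append(state)
    return path

if __name__ == "__main__":
    print(simulate_chain())  # prints one sample path of the walk
```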