Mar·kov·ian \ mär-ˈkō-vē-ən, -ˈkȯ- \
: of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
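The defining property — next-state probabilities that depend only on the current state — can be illustrated with a minimal sketch. The two-state "weather" chain and its transition probabilities below are hypothetical examples chosen for illustration, not part of the definition itself:

```python
import random

# A hypothetical two-state Markov chain: the probability of the next
# state is defined solely in terms of transitions from the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Simulate n transitions of the chain from a starting state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

Because each transition row conditions only on the current state, the chain has no memory of how it arrived there — the hallmark of a Markovian process.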