Markovian

Mar·kov·ian

adjective \mär-ˈkō-vē-ən, -ˈkȯ-\

Definition of MARKOVIAN

:  of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
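
For example, in standard notation a discrete-time stochastic process $(X_n)_{n \ge 0}$ with state space $S$ is Markovian when, for all states $x_0, \dots, x_n, x \in S$,

$$\Pr(X_{n+1} = x \mid X_n = x_n, \dots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n),$$

that is, the probability of a transition to the next state depends only on the present state, not on the earlier history of the process.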

Variants of MARKOVIAN

Mar·kov·ian or Mar·kov \ˈmär-ˌkȯf, -ˌkȯv\ also Mar·koff \ˈmär-ˌkȯf\

First Known Use of MARKOVIAN

1944
