Markovian

adjective Mar·kov·ian \mär-ˈkō-vē-ən, -ˈkȯ-\

Definition of Markovian

  1. :  of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
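As a rough illustration of the defining property named above (the next state's probabilities depend only on the current state via a table of transition probabilities), here is a minimal Python sketch; the state names and probability values are invented for illustration and are not part of the dictionary entry.

```python
import random

# Transition probabilities: current state -> {next state: probability}.
# The next state depends only on the current state (the Markov property).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state using only the current state's transition row."""
    next_states = list(transitions[state])
    weights = [transitions[state][s] for s in next_states]
    return random.choices(next_states, weights=weights, k=1)[0]

# Simulate a short trajectory starting from "sunny".
state = "sunny"
path = [state]
for _ in range(5):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```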


First Known Use of Markovian

1944

Variants of Markovian

or Markov \ˈmär-ˌkȯf, -ˌkȯv\ or less commonly Markoff \ˈmär-ˌkȯf\
