Definition of Markovian
: of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
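The "probabilities defined in terms of transition from the possible existing states to other states" in the definition can be illustrated with a small sketch: the next state is sampled using only the current state's row of transition probabilities. The states and probabilities below are hypothetical examples, not part of the definition.

```python
import random

# Hypothetical two-state weather chain: each row gives the
# probabilities of moving from the current state to each next state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng=random):
    """Sample the next state using only the current state's row
    (the Markov property: history beyond the current state is ignored)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

# Simulate a short chain starting from "sunny".
state = "sunny"
chain = [state]
for _ in range(5):
    state = next_state(state)
    chain.append(state)
print(chain)
```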
First Known Use of Markovian
Variants of Markovian
Markov \ˈmär-ˌkȯf, -ˌkȯv\ or less commonly