Markovian
adjective
Mar·kov·ian
mär-ˈkō-vē-ən, -ˈkȯ-
: of, relating to, or resembling a Markov process or Markov chain especially by having probabilities defined in terms of transition from the possible existing states to other states
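
As a minimal sketch of what "probabilities defined in terms of transition from the possible existing states to other states" looks like in practice, the Python snippet below simulates a small Markov chain. The states ("sunny", "rainy") and the transition probabilities are invented purely for illustration and are not part of the dictionary entry.

    import random

    # Hypothetical two-state weather chain, for illustration only:
    # the probability of the next state depends solely on the current state.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current: str) -> str:
        """Sample the next state using only the current state (the Markov property)."""
        states = list(TRANSITIONS[current])
        weights = [TRANSITIONS[current][s] for s in states]
        return random.choices(states, weights=weights)[0]

    def simulate(start: str, steps: int) -> list[str]:
        """Generate a trajectory of the chain; earlier history never enters the calculation."""
        path = [start]
        for _ in range(steps):
            path.append(next_state(path[-1]))
        return path

    print(simulate("sunny", 5))  # e.g. ['sunny', 'sunny', 'rainy', 'rainy', 'sunny', 'sunny']

The behavior is Markovian because next_state consults only the current state, never the full history of the trajectory.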