Thesaurus of Markov Chain in English
Markov Chain
Synonyms of markov chain
1. (noun) a Markov process for which the parameter takes discrete time values
Hypernyms of markov chain
1. (noun) a Markov process for which the parameter takes discrete time values
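Illustration: the definition above describes a Markov process whose parameter runs over discrete time values, meaning the state advances in integer steps and the next state depends only on the current one. A minimal sketch in Python follows; the two weather states and the transition probabilities are invented for the example and are not part of the thesaurus entry.

import random

# Hypothetical two-state chain; state names and probabilities are example values only.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # The next state is drawn using only the current state (the Markov property).
    next_states, probs = zip(*TRANSITIONS[state].items())
    return random.choices(next_states, weights=probs)[0]

def simulate(start, n_steps):
    # Advance the chain over the discrete time values 0, 1, ..., n_steps.
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))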