Markov Chain

Synonyms of Markov chain

1. (noun) a Markov process for which the parameter is discrete time values

Hypernyms of Markov chain

1. (noun) Markov process: a stochastic process in which the distribution of future states depends only on the present state, not on the path taken to reach it
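The definition above ("a Markov process for which the parameter is discrete time values") can be made concrete with a small simulation. The sketch below uses a hypothetical two-state weather model (the states, probabilities, and function names are illustrative, not part of the definition): the chain advances over discrete time steps, and each next state is sampled using only the current state.

```python
import random

# Hypothetical transition table for a two-state Markov chain.
# Each row lists P(next state | current state); probabilities in a row sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Walk the chain for n_steps discrete time values."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng))
    return states

print(simulate("sunny", 5))
```

Because the time parameter is discrete (step 0, 1, 2, ...), this is a Markov chain; the same dependence-on-the-present-only property over continuous time would instead be a continuous-time Markov process.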