
Markov Chain

Noun

1. Markoff chain, Markov chain

a Markov process for which the parameter takes discrete time values

WordNet Lexical Database for English. Princeton University. 2010.


Markov Chain

noun
statistics
a sequence of events, the probability of each of which depends only on the event immediately preceding it

Collins English Dictionary. Copyright © HarperCollins Publishers
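Both definitions describe the same idea: the next state depends only on the current state, not on the full history. A minimal Python sketch illustrates this with a hypothetical two-state weather model (the state names and transition probabilities are invented for illustration, not taken from either dictionary entry):

```python
import random

# Hypothetical discrete-time Markov chain: two weather states and a
# transition matrix. Each row gives the probabilities of the next state
# given only the current state (the Markov property described above).
STATES = ["sunny", "rainy"]
TRANSITIONS = {
    "sunny": [0.8, 0.2],  # P(next state | current = "sunny"), over STATES
    "rainy": [0.4, 0.6],  # P(next state | current = "rainy")
}

def simulate(start, steps, rng):
    """Walk the chain for `steps` transitions; return the visited states."""
    path = [start]
    for _ in range(steps):
        # The choice depends only on path[-1], never on earlier states.
        path.append(rng.choices(STATES, weights=TRANSITIONS[path[-1]])[0])
    return path

path = simulate("sunny", 5, random.Random(0))
```

Here the "parameter" from the WordNet gloss is the step index 0, 1, 2, ..., which is what makes this a discrete-time chain.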