English
Detailed Synonyms for Markov chain in English
Markov chain:
Markov chain [the ~] noun
the Markov chain; the Markoff chain
– a Markov process for which the parameter is discrete time values
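The definition above can be illustrated with a short sketch: a chain whose "parameter" (time) takes the discrete values 0, 1, 2, ... and whose next state depends only on the current state. The states and transition probabilities below are hypothetical, chosen only for illustration.

```python
import random

# Hypothetical two-state chain: each row lists (next_state, probability)
# and the probabilities in each row sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the tail

def run(start, n):
    """Return the chain's path over n discrete time steps."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(run("sunny", 5))
```

Each call to `step` uses only the current state, which is the Markov property; time advances in unit steps, which is what makes the parameter discrete.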