Definition of Markov Chain

  • 1. (noun) A Markov process for which the parameter is discrete time values; a formal statement of this property is sketched below.

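As a clarifying sketch (not part of the original entry), the defining property behind this definition can be written out for a discrete-time chain. The symbols X_n (the state at step n), the state space S, and the transition matrix P = (p_ij) are notation introduced here for illustration only.

```latex
% A discrete-time stochastic process (X_n), n = 0, 1, 2, ..., taking values
% in a state space S is a Markov chain when the next state depends only on
% the current state, not on the earlier history:
\[
  \Pr\bigl(X_{n+1} = j \mid X_n = i,\; X_{n-1} = i_{n-1},\; \dots,\; X_0 = i_0\bigr)
  \;=\;
  \Pr\bigl(X_{n+1} = j \mid X_n = i\bigr).
\]
% For a time-homogeneous chain this conditional probability is written p_{ij},
% the (i, j) entry of the transition matrix P, with each row summing to 1.
```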
Synonyms for the word "Markov chain"

Semantically linked words for "Markov chain"

Hyponyms for the word "Markov chain"