Definition of Markoff Chain

  • 1. A Markov process whose time parameter takes discrete values (noun; see the formal statement below)
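
The gloss above can be stated precisely. The following is a standard formulation of the discrete-time Markov property, included here for illustration; the state symbols X_0, X_1, ..., i, j and the time index n are assumed notation and are not part of the dictionary entry itself:

    % At each discrete time step n = 0, 1, 2, ..., the chain's next state
    % depends only on its current state, not on the earlier history.
    \[
      P\bigl(X_{n+1} = j \mid X_n = i,\; X_{n-1} = i_{n-1},\; \dots,\; X_0 = i_0\bigr)
      = P\bigl(X_{n+1} = j \mid X_n = i\bigr)
    \]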
