A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
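The defining "memoryless" property above can be sketched with a small discrete-time chain. This is a minimal illustration, not part of the vocabulary record: the transition matrix `P` and its values are hypothetical, chosen only to show that the next-state distribution depends on the current state alone.

```python
# Hypothetical two-state Markov chain: P[i][j] is the probability of
# moving from state i to state j. The future depends only on the
# current state, never on how that state was reached.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Starting from state 0 with certainty, repeated transitions converge
# to this chain's stationary distribution, (5/6, 1/6).
dist = [1.0, 0.0]
for _ in range(200):
    dist = step(dist, P)

print(dist)
```

Because each step uses only the current distribution, any two histories that end in the same state have identical futures, which is exactly the property the definition states.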

Synonyms: Markov Chain; Markov Chains; Markov Process; Markov Processes

Instance information

comment

search PROBABILITY 1968-74; 91(75); was see under PROBABILITY 1975-90

identifier

D008390