Complexity Explorer Santa Fe Institute

Markov chain

A random process that undergoes transitions from one state to another, in which the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it.
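The defining property above can be illustrated with a minimal simulation: the next state is sampled using only the current state's transition probabilities, with no reference to earlier history. The two "weather" states and their probabilities below are a hypothetical example, not part of the definition.

```python
import random

# Hypothetical two-state chain: each row gives the probability
# distribution over the next state, given the current state only.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state's distribution."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Generate a trajectory of n_steps transitions starting from `start`.

    Note that step() is passed only the latest state, never the
    earlier path -- that is the Markov property in code.
    """
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Because each transition depends only on the current state, the whole process is specified by the transition table (for finite chains, a transition matrix whose rows each sum to 1).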


Topics
Dynamical Systems, Mathematics
Difficulty
1