Created by Lewis Lehe (of Setosa and UC Berkeley Trans. Eng.) with design and art by Dennys Hess. Thanks to Ian Johnson for good advice.
Markov chains, named after Andrey Markov, are mathematical systems that hop from one "state" (a situation or set of values) to another. For example, if you made a Markov chain model of a baby's behavior, you might include "playing," "eating," "sleeping," and "crying" as states, which together with other behaviors could form a "state space": a list of all possible states. In addition, on top of the state space, a Markov chain tells you the probability of hopping, or "transitioning," from one state to any other state: for instance, the chance that a baby currently playing will fall asleep in the next five minutes without crying first.
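To make the idea concrete, here is a minimal sketch of the baby example as code. Each key of `transition_probs` is a state in the state space, and its row gives the probability of transitioning from that state to every other state; the numbers themselves are made-up placeholders for illustration, not values from the article.

```python
import random

# Hypothetical transition probabilities for the baby example.
# Each row sums to 1; the specific numbers are invented for illustration.
transition_probs = {
    "playing":  {"playing": 0.4, "eating": 0.3, "sleeping": 0.2, "crying": 0.1},
    "eating":   {"playing": 0.3, "eating": 0.2, "sleeping": 0.4, "crying": 0.1},
    "sleeping": {"playing": 0.3, "eating": 0.3, "sleeping": 0.3, "crying": 0.1},
    "crying":   {"playing": 0.2, "eating": 0.4, "sleeping": 0.3, "crying": 0.1},
}

def next_state(current: str) -> str:
    """Hop to the next state using the current state's transition row."""
    states, weights = zip(*transition_probs[current].items())
    return random.choices(states, weights=weights)[0]

# Simulate a short walk through the state space, starting from "playing".
state = "playing"
walk = [state]
for _ in range(10):
    state = next_state(state)
    walk.append(state)
print(" -> ".join(walk))
```

Running the sketch prints one random walk, e.g. `playing -> eating -> sleeping -> ...`; because the hops are random, each run produces a different sequence, but over many steps the visit frequencies are governed by the transition probabilities.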