Markov chain/Definition
A Markov chain is a Markov process whose state space is finite or countably infinite.
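The definition can be stated formally (a standard formulation, not taken from the original page): a stochastic process $(X_n)_{n \ge 0}$ taking values in a finite or countably infinite set $S$ is a Markov chain if it satisfies the Markov property, i.e. the conditional distribution of the next state depends only on the present state:

```latex
\Pr\bigl(X_{n+1}=j \mid X_n=i,\; X_{n-1}=i_{n-1},\dots,X_0=i_0\bigr)
  = \Pr\bigl(X_{n+1}=j \mid X_n=i\bigr)
\qquad \text{for all } n \ge 0 \text{ and } i_0,\dots,i_{n-1},i,j \in S.
```

When the right-hand side does not depend on $n$, the chain is called time-homogeneous, and the quantities $p_{ij} = \Pr(X_{n+1}=j \mid X_n=i)$ form its transition probability matrix.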