subject
Mathematics, 19.02.2020 19:59 jtal

Make a Markov chain model of a poker game where the states are the number of dollars a player has. With probability .3 a player wins 1 dollar in a period, with probability .4 a player loses 1 dollar, and with probability .3 a player stays the same. The game ends if the player loses all his or her money or if the player has 6 dollars (when the game ends, the Markov chain stays in its current state forever). The Markov chain should have seven states, corresponding to the seven different amounts of money: 0, 1, 2, 3, 4, 5, or 6 dollars. If you now have $2, what is your probability distribution in the next round? In the round after that?
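A minimal sketch of how this chain can be set up and iterated, assuming the standard convention that row i of the transition matrix gives the probabilities of moving from state i (dollars held) to each other state, with states 0 and 6 absorbing. The function names (`transition_matrix`, `step`) are illustrative, not from the question:

```python
WIN, LOSE, STAY = 0.3, 0.4, 0.3  # per-round probabilities from the question

def transition_matrix():
    """Build the 7x7 transition matrix for states 0..6 dollars."""
    P = [[0.0] * 7 for _ in range(7)]
    P[0][0] = 1.0          # absorbing: player is broke
    P[6][6] = 1.0          # absorbing: player reached $6
    for s in range(1, 6):  # interior states move down, stay, or up
        P[s][s - 1] = LOSE
        P[s][s] = STAY
        P[s][s + 1] = WIN
    return P

def step(dist, P):
    """One round: new_dist[j] = sum over i of dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(7)) for j in range(7)]

P = transition_matrix()
dist = [0, 0, 1, 0, 0, 0, 0]   # start with $2 for certain
after1 = step(dist, P)          # ≈ [0, .40, .30, .30, 0, 0, 0]
after2 = step(after1, P)        # ≈ [.16, .24, .33, .18, .09, 0, 0]
```

So starting from $2, the next round gives $1 with probability .4, $2 with .3, and $3 with .3; the round after that gives $0, $1, $2, $3, $4 with probabilities .16, .24, .33, .18, .09 (small floating-point error aside).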

Answers: 1
