Consider this game. I've got a die with sides marked "win" and "lose", such that the probability of rolling "lose" is X. You roll the die; if you roll "lose", the game ends. If you roll "win", I give you a dollar and you roll again.
For example, if X is 0.15 (a 15% chance of rolling "lose"), the probability of getting nothing is 15%. The probability of getting exactly $1 is 85% * 15%, and the probability of getting exactly $2 is 85% * 85% * 15%.
Put generally, the probability of any result n is:
P(n) = X(1-X)^n
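As a sanity check, those probabilities account for every possible outcome, so they should sum to 1 over all n, and they do; the sum is just a geometric series:

\[
\sum_{n=0}^{\infty} X(1-X)^n \;=\; X \cdot \frac{1}{1-(1-X)} \;=\; 1
\]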
The question: How can I calculate the average expected payout for a given value of X? All I can figure out is that it would be the sum of this infinite series:

E = 0*P(0) + 1*P(1) + 2*P(2) + ... = the sum of n * X(1-X)^n over all n from 0 to infinity
Once upon a time I might have been able to figure out whether that converges on something I can express as a simpler formula, or at least calculate it for a few selected values of X. But I don't even know where to start now.
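In the meantime, the series can at least be summed numerically for selected values of X, and the game can be simulated directly as a cross-check. Here is a minimal sketch in Python (the function names are just illustrative):

    import random

    def expected_payout_series(x, terms=100_000):
        """Partial sum of n * P(n), where P(n) = x * (1 - x)**n."""
        total = 0.0
        p = x                 # P(0) = x
        for n in range(terms + 1):
            total += n * p
            p *= (1 - x)      # P(n+1) = P(n) * (1 - x)
        return total

    def expected_payout_simulated(x, games=200_000):
        """Average payout over many simulated games."""
        total = 0
        for _ in range(games):
            dollars = 0
            while random.random() >= x:   # "win" with probability 1 - x
                dollars += 1              # a dollar per win, then roll again
            total += dollars
        return total / games

    for x in (0.15, 0.25, 0.5):
        print(x, expected_payout_series(x), expected_payout_simulated(x))

For X = 0.15, both approaches land at roughly 5.67.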
1 comment:
I worked on this some yesterday with some help from a colleague. The average or expected value is (1-X)/X.
I will explain next time I chat with you in an interactive setting, if you would like to hear the details.
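One standard way to see that result: each term n*P(n) counts the probability P(n) once for every k from 1 to n, so the expected payout can be rewritten as a sum of geometric tails, each of which has a simple closed form.

\[
E \;=\; \sum_{n=0}^{\infty} n\,X(1-X)^n
  \;=\; \sum_{k=1}^{\infty}\sum_{n=k}^{\infty} X(1-X)^n
  \;=\; \sum_{k=1}^{\infty} (1-X)^k
  \;=\; \frac{1-X}{X}
\]

For X = 0.15 this gives 0.85/0.15, roughly 5.67, matching the numbers above.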