evinda
Hello! (Wave)
Suppose that $k$ football matches are being played and a bet consists of a prediction of the result of each match, where the result is 1 if the first team wins, 2 if the second team wins, or 0 if the match is a tie. So a bet is an element of $\{0,1,2\}^k$. I want to show that $g(k) \geq \frac{3^k}{2k+1}$, where $g(k)$ is the minimum number of bets required to guarantee that at least the second prize is won, i.e. that some bet has at most one wrong prediction.
Could you explain to me why the inequality holds?
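(To fix ideas, here is my guess at the kind of argument intended; the Hamming-distance framing below is my own assumption, not something stated in the exercise.) Write $d_H(x,b)$ for the number of matches on which a bet $b$ disagrees with an outcome $x \in \{0,1,2\}^k$. A fixed bet $b$ has at most one wrong prediction for exactly
$$\left|\{x \in \{0,1,2\}^k : d_H(x,b) \leq 1\}\right| = 1 + k \cdot 2 = 2k+1$$
outcomes: the outcome equal to $b$ itself, plus the $2k$ outcomes obtained by changing one of the $k$ coordinates to one of the other $2$ values. If $g(k)$ bets guarantee the second prize for every possible outcome, these balls must cover all $3^k$ outcomes, so
$$g(k)\,(2k+1) \geq 3^k \quad \Longrightarrow \quad g(k) \geq \frac{3^k}{2k+1}.$$
Is this the right direction?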