CaptainP
Pretty theoretical question here. I was talking with a friend the other day about a basic statistics problem that uses random variates. The problem asked us to perform 20 simulations of the World Series using a U(0,1) distribution. One team was given a probability of .55 of winning any given game, the other team .45, and the series is won best-of-seven (first to 4 wins). My friend generated one variate at a time, representing each game as it was played, so some of his simulations used just 4 variates while others used 5, 6, or 7. I generated an ordered set of 7 variates for every single simulation and worked my way through them until a winner was determined, so some of my simulations did not use all of the variates generated for them. Is my method improper? Is there something wrong with "skipping" or "wasting" random variates like I did?
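For concreteness, here is a minimal sketch of the two approaches in Python (the function names and the use of the standard `random` module are my own illustrative choices, not anything from the original problem). Both draw i.i.d. U(0,1) variates; one stops drawing as soon as the series is decided, the other draws 7 up front and discards the unused tail.

```python
import random

P_WIN = 0.55  # probability that team A wins any single game

def simulate_series_sequential(rng):
    """Friend's method: draw one variate per game, stop when decided."""
    a = b = 0
    while a < 4 and b < 4:
        if rng.random() < P_WIN:
            a += 1
        else:
            b += 1
    return a == 4  # True if team A wins the series

def simulate_series_batch(rng):
    """Batch method: draw 7 variates up front, use only as many as needed."""
    u = [rng.random() for _ in range(7)]
    a = b = 0
    for x in u:
        if x < P_WIN:
            a += 1
        else:
            b += 1
        if a == 4 or b == 4:
            break  # remaining variates are simply discarded
    return a == 4

rng = random.Random(42)
wins = sum(simulate_series_batch(rng) for _ in range(20))
print(f"Team A won {wins} of 20 simulated series")
```

Note that for a single series started from the same random stream, the batch method uses exactly the same prefix of variates as the sequential method, so the two produce identical outcomes; the discarded variates never influence anything, since each draw is independent.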