dranglerangus
I was watching a YouTube video from MIT's OpenCourseWare series on probability. A scenario was proposed: Al is waiting for a bus. The time (in minutes) until the bus arrives is a random variable X, uniformly distributed on the interval [0,10].
I understand that the cdf of X is F(x) = {0 for x<0}, {x/10 for 0≤x≤10}, and {1 for x>10}.
This says that the probability is 1 that X≤10, or equivalently that X∈[0,10], right? So does this mean that the bus definitely arrives between 0 and 10 minutes? This seems counterintuitive, since at any given moment while Al was waiting, the probability density of the bus showing up was only 1/10 per minute. To me, this doesn't seem to guarantee that the bus would have to show up at some time in that interval.
The laws of probability guarantee that the bus will come within 10 minutes, but it doesn't seem right to me. Am I understanding this incorrectly?
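This isn't from the video, just a quick sanity-check sketch: sampling many arrival times uniformly on [0,10] and checking that every one of them lands in the interval, and that the empirical cdf at a point (here x = 4, chosen arbitrarily) comes out close to x/10.

```python
import random

# Simulate many independent bus-arrival times, each uniform on [0, 10] minutes.
N = 100_000
arrivals = [random.uniform(0, 10) for _ in range(N)]

# Every simulated arrival falls in [0, 10], consistent with P(X <= 10) = 1.
print(all(0 <= t <= 10 for t in arrivals))  # True

# The empirical cdf at x is the fraction of arrivals <= x; it should be
# close to F(x) = x/10, e.g. F(4) = 0.4.
x = 4
emp_cdf = sum(t <= x for t in arrivals) / N
print(emp_cdf)  # close to 0.4
```

Of course, a simulation only draws from a variable already defined to live on [0,10]; it can't settle whether the model fits a real bus.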