dmlockwood
Let's say that the Los Angeles Dodgers have an average of 5.25 runs scored (the sample mean), with a standard deviation of 3.15 across this sample. The Colorado Rockies have an average of 4.45 runs scored and a standard deviation of 3.65. Assuming normal distributions, how would we determine the probability of the Rockies scoring more runs than the Dodgers, and vice versa?
I know that I am really generalizing things here...there is more to it. This example illustrates what I am trying to do, though: compare two normal distributions. I have taken statistics courses, but it's been a while...any help on this would be greatly appreciated.
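For reference, a minimal sketch of the usual approach, assuming the two teams' scores are independent and that a normal model is acceptable even though runs are discrete and non-negative: look at the difference R − D, which is itself normal with mean μ_R − μ_D and variance σ_R² + σ_D², and ask how often that difference is positive.

```python
# Sketch only: treats the two run totals as independent normal random
# variables and works with their difference.
from math import sqrt
from scipy.stats import norm

# Dodgers: D ~ N(5.25, 3.15^2); Rockies: R ~ N(4.45, 3.65^2)
mu_d, sd_d = 5.25, 3.15
mu_r, sd_r = 4.45, 3.65

# If D and R are independent, R - D is normal with
# mean = mu_r - mu_d and variance = sd_r^2 + sd_d^2.
mu_diff = mu_r - mu_d
sd_diff = sqrt(sd_r**2 + sd_d**2)

p_rockies = norm.sf(0, loc=mu_diff, scale=sd_diff)  # P(R - D > 0)
p_dodgers = 1 - p_rockies                           # ignoring exact ties

print(f"P(Rockies > Dodgers) ~ {p_rockies:.3f}")    # ~ 0.434
print(f"P(Dodgers > Rockies) ~ {p_dodgers:.3f}")    # ~ 0.566
```

This ignores ties and any correlation between the two scores (e.g., games played against each other or in the same ballpark conditions), so treat the numbers as a rough illustration rather than a full model.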