In high school, I was taught that the standard deviation drops as you increase the sample size, and that larger samples therefore fluctuate less. At the time, I didn't question this because it made sense.
Then I was taught that the standard deviation does not drop as you increase the sample size; rather, it is the standard error that drops. That sounded fine to me, and most of the resources I can find agree: the sample standard deviation might fluctuate a little, but it does not shrink as the sample size grows.
But today I came across this ISOBudgets article: http://www.isobudgets.com/introduction-statistics-uncertainty-analysis/#sample-size. It states: "Have you ever wanted to reduce the magnitude of your standard deviation? Well, if you know how small you want the standard deviation to be, you can use this function to tell you how many samples you will need to collect to achieve your goal." It then gives a function for finding this minimum n: √n = z × s / E, where z is the critical value for the desired confidence level, s is the current standard deviation, and E is the margin of error.
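Just to check that I'm applying the formula correctly, here is how I read it (the values for s and E are made up, and I'm assuming z is the two-sided normal critical value):

```python
from scipy.stats import norm

# z: two-sided critical value for a 95% confidence level (~1.96)
z = norm.ppf(0.975)

s = 10.0  # current sample standard deviation (made-up value)
E = 2.0   # desired margin of error (made-up value)

# sqrt(n) = z * s / E  =>  n = (z * s / E)**2
n = (z * s / E) ** 2
print(n)  # ~96.04, so round up to n = 97 samples
```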
Am I misreading this? I used a random number generator to draw about 500 random numbers from a normal distribution, and the sample standard deviation does not drop. What am I missing?
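For reference, this is roughly what my check looked like (a minimal numpy sketch; the standard normal and the specific sample sizes are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw increasingly large samples from a standard normal (true sigma = 1)
for n in [10, 50, 100, 500, 5000]:
    x = rng.standard_normal(n)
    sd = x.std(ddof=1)    # sample standard deviation
    se = sd / np.sqrt(n)  # standard error of the mean
    print(f"n={n:5d}  sd={sd:.3f}  se={se:.4f}")

# The sample sd hovers around 1 rather than shrinking;
# only the standard error of the mean drops as n grows.
```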