yungman
Say I have a random number generator that produces numbers between -200 and +200, inclusive. We know the average has to be zero if you sample an infinite number of times. But what if you are only allowed to sample 30 times: how far from the ideal zero can the average of those 30 samples be?
Is there a formula for this? I am not a math student and this is not homework; it's a real-life engineering problem I am facing. I have 140 mV rms of white noise (totally random) riding on a DC level, and I want to find that DC level. If I spot-sample 30 times and take the average, how close can I get to the true DC level?
Thanks
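To make the question concrete, here is a minimal Monte Carlo sketch of the situation described above. It assumes the noise is zero-mean Gaussian with 140 mV rms (the post only says "white noise"), and the DC level of 1000 mV is purely hypothetical. It compares the empirically observed rms error of a 30-sample average against the textbook standard error of the mean, sigma/sqrt(n):

```python
import math
import random

SIGMA_MV = 140.0   # rms noise in mV (from the post)
N_SAMPLES = 30     # samples per average (from the post)
DC_MV = 1000.0     # hypothetical true DC level in mV (assumed for illustration)
TRIALS = 20000     # number of simulated 30-sample averages

rng = random.Random(42)

def sample_mean() -> float:
    """Average of N_SAMPLES noisy readings of the DC level."""
    total = sum(DC_MV + rng.gauss(0.0, SIGMA_MV) for _ in range(N_SAMPLES))
    return total / N_SAMPLES

# Error of each 30-sample average relative to the true DC level.
errors = [sample_mean() - DC_MV for _ in range(TRIALS)]
empirical_rms_error = math.sqrt(sum(e * e for e in errors) / TRIALS)

# Standard error of the mean: sigma / sqrt(n).
theoretical = SIGMA_MV / math.sqrt(N_SAMPLES)

print(f"theoretical sigma/sqrt(n): {theoretical:.1f} mV")
print(f"empirical rms error:       {empirical_rms_error:.1f} mV")
```

Under these assumptions the typical (rms) error of the 30-sample average works out to about 140/sqrt(30) ≈ 25.6 mV; there is no hard "maximum" error for unbounded Gaussian noise, only a probability that the average falls within some band (e.g. roughly 95% within ±2 standard errors).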