Agent Smith
TL;DR Summary: Expected number of shots on target
Say an archer has a probability p of hitting the target.
Given n shots at the target, the expected number of hits = ##np##
The standard deviation of the number of hits = ##\sqrt{np(1-p)}##
Say p = 0.7
Given 100 shots, my expected/average number of hits = ##100 \times 0.7 = 70##
The standard deviation for the number of hits = ##\sqrt{100 \times 0.7 \times 0.3} = \sqrt{21} \approx 4.6##
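As a sanity check on the arithmetic, here is a small Python sketch; it assumes nothing beyond the n = 100 and p = 0.7 above:

```python
from math import sqrt

n, p = 100, 0.7                 # number of shots, probability of a hit

mean = n * p                    # expected number of hits: 70
sd = sqrt(n * p * (1 - p))      # standard deviation: sqrt(21) ≈ 4.6

print(f"mean = {mean}, sd = {sd:.2f}")
```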
We can approximate this with a normal distribution because the expected number of successes (##np = 70##) and the expected number of failures (##n(1-p) = 30##) are both greater than 10.
Does this mean that about 95% of the time the number of hits = ##70 \pm 2 \times 4.6 \approx 70 \pm 9## (mean ##\pm 2 \times## standard deviation)?
So we have an interval of roughly (61, 79). Is this a confidence interval or something else?
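And here is a quick simulation sketch of the "95% of the time" reading, assuming each shot is an independent Bernoulli trial with p = 0.7 (numpy is used only for convenience, it is not part of the question). It estimates how often the hit total from 100 shots lands inside mean ± 2 SD; whether that interval should be called a confidence interval is exactly the question above.

```python
import numpy as np
from math import sqrt

n, p, trials = 100, 0.7, 100_000
mean = n * p                          # 70
sd = sqrt(n * p * (1 - p))            # ≈ 4.6

# Simulate 'trials' independent sessions of 100 shots each.
hits = np.random.binomial(n, p, size=trials)

# Fraction of sessions whose hit count falls within mean ± 2 SD.
inside = np.mean((hits >= mean - 2 * sd) & (hits <= mean + 2 * sd))
print(f"interval: ({mean - 2*sd:.1f}, {mean + 2*sd:.1f}), coverage ≈ {inside:.3f}")
```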