Standard deviation as a percent?

In summary, to find the standard deviation of the sample space 2, 3, and 5, you take the square root of the average squared difference between each data point and the mean: the mean is 10/3 ≈ 3.33 and the standard deviation is about 1.25 (the 0.775 in the opening post comes from dividing 1.55 by 2 instead of taking its square root). To express that spread as a percentage, divide the standard deviation by the mean and multiply by 100, which gives roughly 37%. Separately, the fraction of the sample lying within one standard deviation of the mean, i.e. between about 2.09 and 4.58, is 1/3 ≈ 33.3%, since only the point 3 falls in that range.
  • #1
zeromodz
Say I have a sample space of 2, 3, and 5. I want to find by what percentage the points deviate from the mean. So I would take the standard deviation as follows.

(2 + 3 + 5) / 3 = 3.33

[(2 - 3.33)^2 + (3 - 3.33)^2 + (5 - 3.33)^2] / 3 = 1.55

(1.55)^(1/2) = 0.775

So we get a standard deviation of 0.775. So how do I turn the standard deviation into a plus or minus percentage?
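As a quick numerical check, here is the same calculation done step by step (a minimal sketch in Python, not part of the original post):

```python
import math

data = [2, 3, 5]

# Mean of the three equally likely points
mean = sum(data) / len(data)                                # 3.333...

# Average squared deviation from the mean (population variance)
variance = sum((x - mean) ** 2 for x in data) / len(data)   # 1.555...

# Standard deviation is the square root of the variance
sd = math.sqrt(variance)                                    # about 1.247

print(mean, variance, sd)
```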
 
  • #2
Step 1: √1.55555 ≈ 1.25
Step 2: 100(1.25/3.33) ≈ 37.5%
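That second step expresses the standard deviation as a percentage of the mean (the coefficient of variation). A minimal sketch of the same arithmetic in Python, assuming the three points are equally likely:

```python
import statistics

data = [2, 3, 5]

mean = statistics.mean(data)    # 3.333...
sd = statistics.pstdev(data)    # population standard deviation, about 1.247

# Standard deviation expressed as a percent of the mean
sd_percent = 100 * sd / mean    # about 37.4% (37.5% with the rounded values above)

print(f"{sd_percent:.1f}%")
```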
 
  • #3
It's important that you say that the three points, 2, 3, and 5 are equally likely results. That is implied by your calculation of the mean = (2+3+5)/3, where the three points each have probability 1/3 of occurring.

As @mathman points out, your calculation of the standard deviation is wrong. You divided by 2 instead of taking the square root. The correct value is 1.24722.

You want to know what percentage of a sample will be within 1 standard deviation of the mean, that is, between 3.33333 - 1.24722 and 3.33333 + 1.24722 (between 2.086 and 4.581). Only the result X = 3 is in that range, so that probability is 1/3 ≈ 33.3%.
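A short check of that count, under the same assumption that each of the three points has probability 1/3:

```python
import statistics

data = [2, 3, 5]
mean = statistics.mean(data)     # 3.333...
sd = statistics.pstdev(data)     # 1.247...

# Keep only the points lying within one standard deviation of the mean
within = [x for x in data if mean - sd <= x <= mean + sd]   # only 3 qualifies

print(within, len(within) / len(data))   # [3] 0.333... -> 33.3%
```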
 

FAQ: Standard deviation as a percent?

What is standard deviation as a percent?

Standard deviation as a percent is a measure of how much the data values in a set deviate from the mean, expressed as a percentage of the mean. It is also known as the coefficient of variation.

How is standard deviation as a percent calculated?

To calculate standard deviation as a percent, first calculate the standard deviation of the data set. Then, divide the standard deviation by the mean and multiply by 100 to get the percentage.

What does a high standard deviation as a percent indicate?

A high standard deviation as a percent indicates that the data values are spread out widely relative to the size of the mean, suggesting that the data set has a large amount of variability.

What does a low standard deviation as a percent indicate?

A low standard deviation as a percent indicates that the data values are closely clustered around the mean relative to its size, suggesting that the data set has a small amount of variability.

How is standard deviation as a percent used in data analysis?

Standard deviation as a percent is used in data analysis to understand the spread of data values within a data set. It can also be used to compare the variability of different data sets.
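For instance, here is a sketch of that kind of comparison (the second data set is purely hypothetical, chosen only for illustration):

```python
import statistics

def sd_as_percent(data):
    """Standard deviation as a percent of the mean (the coefficient of variation)."""
    return 100 * statistics.pstdev(data) / statistics.mean(data)

sample = [2, 3, 5]        # the sample space from this thread
scaled = [20, 30, 50]     # hypothetical data set with ten times the spread

# The absolute standard deviations differ by a factor of ten,
# but the relative spread is the same.
print(f"{sd_as_percent(sample):.1f}%  vs  {sd_as_percent(scaled):.1f}%")   # 37.4% vs 37.4%
```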
