Standard deviation and standard error

In summary, the data has a mean of 21.2°C, a standard deviation of 2°C, and a standard error of 0.8°C. Depending on the context, the temperature could be reported as 21.2 ± 2°C or as 21.2 ± 0.8°C. Without any indication, it is impossible to tell whether the uncertainty figure refers to the standard deviation or the standard error, so always state which measure of uncertainty is being reported.
  • #1
i_love_science
The mean of some data was 21.2°C, the standard deviation was 2°C, and the standard error was 0.8°C.

My textbook says that using one standard deviation, we would report the temperature of the substance as 21.2 ± 2°C, while using the standard error, the temperature would be reported as 21.2 ± 0.8°C.

If I did not know whether the temperature was reported using standard deviation or standard error, how would I determine that from the given value itself?

Thanks.
 
  • #2
You can't. Without any indication, it could be SD or SE, or something else, e.g. 95% confidence interval (±2 SE). You should always make it clear what your uncertainty figure refers to.
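To make the distinction concrete, here is a minimal Python sketch. The raw readings are invented (the thread only gives the summary statistics), but they are chosen so the output roughly matches the figures above: a mean of 21.2°C, an SD of about 2°C, and an SE of about 0.8°C.

```python
import math
import statistics

# Hypothetical readings (°C) -- the thread never gives the raw data,
# so these six values are invented to roughly reproduce its statistics.
readings = [21.0, 23.5, 18.7, 20.3, 23.7, 20.0]

n = len(readings)
mean = statistics.mean(readings)
sd = statistics.stdev(readings)   # sample SD (n - 1 in the denominator)
se = sd / math.sqrt(n)            # standard error of the mean

print(f"mean = {mean:.1f} °C")
print(f"SD   = {sd:.1f} °C  ->  report as {mean:.1f} ± {sd:.1f} °C")
print(f"SE   = {se:.1f} °C  ->  report as {mean:.1f} ± {se:.1f} °C")
print(f"95% CI ≈ {mean:.1f} ± {1.96 * se:.1f} °C")
```

All of the printed intervals are legitimate ways to report the same measurement, which is exactly why the number alone cannot tell you which convention was used.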
 

FAQ: Standard deviation and standard error

What is the difference between standard deviation and standard error?

Standard deviation is a measure of how much the data points in a sample vary from the mean. It is used to describe the spread or dispersion of the data. Standard error, on the other hand, is a measure of how much the sample mean varies from the true population mean. It is used to estimate the precision of the sample mean.
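As a quick illustration (the sample size n = 6 is an assumption; the thread never states it, but it is consistent with the figures quoted): a sample standard deviation of s = 2°C gives

$$\mathrm{SE} = \frac{s}{\sqrt{n}} = \frac{2\,^\circ\mathrm{C}}{\sqrt{6}} \approx 0.8\,^\circ\mathrm{C}.$$

With n = 100 readings of the same spread, the SD would still be 2°C, but the SE would shrink to 0.2°C.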

How is standard deviation calculated?

Standard deviation is calculated by taking the square root of the variance, which is the average of the squared differences from the mean (for a sample, the sum of squared differences is usually divided by n − 1 rather than n). It is represented by the symbol σ (sigma) for a population, or s for a sample, and is commonly used to measure the variability of a set of data.
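In symbols, using the usual sample convention with Bessel's correction:

$$s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},$$

where $\bar{x}$ is the sample mean. The population version, σ, divides by N instead of n − 1.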

What does a high or low standard deviation indicate?

A high standard deviation indicates that the data points are spread out over a larger range, while a low standard deviation indicates that the data points are closer to the mean. In other words, a high standard deviation suggests that the data is more variable, while a low standard deviation suggests that the data is more consistent.

How is standard error different from standard deviation?

Standard error is a measure of how much the sample mean varies from the true population mean, while standard deviation is a measure of how much the data points in a sample vary from the mean. Standard error takes into account the sample size, while standard deviation does not.
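Explicitly, the standard error of the mean divides the sample standard deviation by the square root of the sample size:

$$\mathrm{SE}_{\bar{x}} = \frac{s}{\sqrt{n}},$$

so collecting more data leaves s roughly unchanged but steadily shrinks the SE.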

Why is standard error important in statistical analysis?

Standard error is important in statistical analysis because it helps us determine the reliability of our sample mean. It allows us to estimate the range within which the true population mean is likely to fall, based on the variability of our sample. This helps us make more accurate inferences about the population based on our sample data.
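For example, assuming the sample mean is approximately normally distributed, a 95% confidence interval for the true mean is roughly

$$\bar{x} \pm 1.96\,\mathrm{SE} \approx 21.2 \pm 1.6\,^\circ\mathrm{C}$$

using the thread's figures; this is the "±2 SE" interval mentioned in post #2.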
