Extra Sum of Squares [Statistics]

In summary, the thread discusses how to calculate the Extra Sum of Squares for an ANOVA problem. The answer key gives 113, but the poster cannot reproduce it: an attempt to find the Total Sum of Squares from the correction for mean and the sum of squares of all observations yields 8454.96, which is probably a "sum of squares of differences of the means" or something similar rather than the quantity needed. The poster asks where the calculation went wrong and how to fix it.
  • #1
cybernerd

Homework Statement



I'm trying to study off an old practice midterm for my upcoming statistics midterm. The midterm is attached.

Part One is the following problem:

The effect of a new antidepressant drug on reducing the severity of depression was studied in manic-depressive patients at three mental hospitals. In each hospital all such patients were randomly assigned to either a treatment (new drug) or a control (old drug) group with different doses. The results of this experiment are summarized in the following tables; a high mean score indicates more of a lowering in depression level than does a low mean score.

Summary Statistics for 6 Groups:

http://a8.sphotos.ak.fbcdn.net/hphotos-ak-ash4/429073_10150583998056837_710136836_9423368_1227940643_n.jpg

For the ANOVA, I am given only one value.

Sum of Squares Within Groups: 58.5

I am then asked to calculate the Extra sum of squares to the nearest integer.

Homework Equations



Extra Sum of Squares = Total Sum of Squares - Within Groups Sum of Squares = Between Groups Sum of Squares

Total Sum of Squares = Sum of Squares of All Observations - Correction for Mean

Correction for Mean = (total of all observations) ^2 / N
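
To see how these fit together, here is a minimal Python sketch with purely hypothetical data for a one-way layout; it just checks that Total SS = (sum of squares of all observations) - (correction for mean) and that Total SS = Within-groups SS + Between-groups (extra) SS:

Code:
# Hypothetical one-way layout: two groups of raw observations.
groups = {
    "A": [4.0, 5.0, 6.0],
    "B": [7.0, 8.0, 9.0, 10.0],
}

all_obs = [y for ys in groups.values() for y in ys]
N = len(all_obs)
grand_total = sum(all_obs)
grand_mean = grand_total / N

# Correction for Mean = (total of all observations)^2 / N
correction_for_mean = grand_total ** 2 / N

# Total SS = sum of squares of all observations - correction for mean
ss_total = sum(y ** 2 for y in all_obs) - correction_for_mean

# Within-groups SS: squared deviations from each group's own mean
ss_within = sum(
    sum((y - sum(ys) / len(ys)) ** 2 for y in ys) for ys in groups.values()
)

# Between-groups ("extra") SS: group size times squared deviation of the
# group mean from the grand mean
ss_between = sum(
    len(ys) * (sum(ys) / len(ys) - grand_mean) ** 2 for ys in groups.values()
)

print(ss_total, ss_within + ss_between)  # prints 28.0 28.0, i.e. they agree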

The Attempt at a Solution



I know from the answer key that the solution is 113.

I know I need to find the Total Sum of Squares, so I started by trying to calculate the correction for mean. I don't have the actual data, just the means, so I tried calculating:

Total of all observations = n1*x1 + n2*x2 + ... + n6*x6, which gave me 223.6.

223.6 squared = 49996.96, which, divided by N = 42, gives a correction for mean of about 1190.4.

Then I tried to get the sum of squares of all observations by multiplying each mean by its n, squaring that for each of the 6 groups, and then adding all 6 together, such that:

(n1*x1)^2 + (n2*x2)^2 and so forth

= (8*8)^2 + (6*5.5)^2 + (10*5)^2 + (9*3)^2 + (4 * 6.4)^2 + (5*4.8)^2

= 9645.36

So Total would be:

9645.36 - 1190.4 = 8454.96.

... Which is laughably wrong.

Can anybody tell me where I screwed up, and how I can fix it? I feel like I'm calculating this wrong altogether. Can anybody help me at all?

Thank you!
 
  • #2
Through disuse, I have forgotten all content of my statistics subjects :frown:
and I don't understand the broader issue of what this question is about, nevertheless ...
Sum of Squares of All Observations = (n1*x1 + n2*x2 + ... + n6*x6)^2,
This comes nowhere near what I would expect for the "sum of squares" of anything.

I'd be happy to go with: (x1^2)*n1 + (x2^2)*n2 + ...
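
For what it's worth, plugging in the group sizes and means that appear in post #1's arithmetic (assuming they really are the table values: n = 8, 6, 10, 9, 4, 5 with means 8, 5.5, 5, 3, 6.4, 4.8) and squaring things that way does land near the answer key's 113. A quick Python sketch:

Code:
# Group sizes and means as read off post #1's arithmetic (assumed to
# match the attached summary table).
n    = [8, 6, 10, 9, 4, 5]
xbar = [8.0, 5.5, 5.0, 3.0, 6.4, 4.8]

N = sum(n)                                              # 42
grand_total = sum(ni * xi for ni, xi in zip(n, xbar))   # 223.6
correction_for_mean = grand_total ** 2 / N              # about 1190.40

# (x_i^2) * n_i, not (n_i * x_i)^2
ss_from_means = sum(ni * xi ** 2 for ni, xi in zip(n, xbar))   # 1303.54

extra_ss = ss_from_means - correction_for_mean
print(round(extra_ss, 2))                               # 113.14, i.e. 113

Since only group means are available, this quantity is the between-groups sum of squares computed directly, which by the first relevant equation is the "extra" sum of squares; the given within-groups value of 58.5 isn't needed for it.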
 

FAQ: Extra Sum of Squares [Statistics]

What is Extra Sum of Squares and why is it important in statistics?

Extra Sum of Squares (ESS) is a statistical method used to compare the fit of two regression models. It measures the additional sum of squares that is explained by adding a specific variable to the model. ESS is important because it allows us to determine if adding a variable significantly improves the model's predictive power.

How is Extra Sum of Squares calculated?

The ESS is calculated by taking the difference between the residual sum of squares of the reduced model (without the variable in question) and the residual sum of squares of the full model (with all variables included); equivalently, it is the increase in the explained (regression) sum of squares when the variable is added. For the associated F test, this difference is divided by the number of degrees of freedom separating the two models.
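
As a rough illustration (hypothetical data and plain NumPy least squares, not any particular statistics package's API), the reduced-versus-full comparison looks like this:

Code:
import numpy as np

# Hypothetical data: y depends strongly on x1 and weakly on x2.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

X_reduced = np.column_stack([np.ones(n), x1])       # model without x2
X_full    = np.column_stack([np.ones(n), x1, x2])   # model with x2

# Extra SS explained by adding x2 = drop in the residual sum of squares
extra_ss = rss(X_reduced, y) - rss(X_full, y)
print(extra_ss)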

What is the significance of the F-statistic in Extra Sum of Squares?

The F-statistic is used to judge whether the extra sum of squares is statistically significant: the extra sum of squares per added parameter is compared with the residual mean square of the full model. A large F-statistic indicates that adding the variable improves the fit by more than random noise alone would be expected to explain.
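
Continuing the same hypothetical setup as the sketch above, the F-statistic and its p-value can be computed roughly like this (the extra sum of squares per added parameter is compared with the full model's residual mean square):

Code:
import numpy as np
from scipy import stats

# Same hypothetical data as in the previous sketch.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

X_reduced = np.column_stack([np.ones(n), x1])       # reduced model
X_full    = np.column_stack([np.ones(n), x1, x2])   # full model

extra_ss = rss(X_reduced, y) - rss(X_full, y)
df_extra = X_full.shape[1] - X_reduced.shape[1]     # parameters added (1)
df_resid = n - X_full.shape[1]                      # residual df, full model

F = (extra_ss / df_extra) / (rss(X_full, y) / df_resid)
p_value = stats.f.sf(F, df_extra, df_resid)         # upper-tail probability
print(F, p_value)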

When should Extra Sum of Squares be used?

ESS should be used when comparing the fit of two regression models with different numbers of variables. It is commonly used in ANOVA (Analysis of Variance) tests to determine the significance of adding a new variable to the model. ESS can also be used in other types of statistical analyses, such as linear regression and multiple regression.

What are some limitations of Extra Sum of Squares?

One limitation of ESS is that it assumes a linear relationship between the variables being compared. It may not be suitable for non-linear relationships. Additionally, ESS can only be used to compare two models at a time, so it may not be appropriate for comparing multiple models simultaneously. It is also important to note that a significant ESS does not necessarily indicate a causal relationship between the added variable and the outcome variable.
