fog37
TL;DR Summary: OLS in GLM models...
Hello,
I know this is a big topic, but I would like to check that what I understand so far is at least correct; I will look into it more. GLM is a family of statistical models that are linear in the coefficients, the ##\beta##s. The relation between ##Y## and the covariates ##X## can be nonlinear (e.g., polynomial regression and logistic regression). The relation to look at is the one between the link function applied to the mean of ##Y## and the coefficients. For example, in logistic regression the probability ##p## is related to the covariates ##X## via a sigmoid equation, so ##p## and the ##\beta##s are not in a linear relation. But the logit, ##\log\frac{p}{1-p}##, and the ##\beta##s are!
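A minimal sketch of that last point, in plain NumPy with made-up coefficient values (everything below is hypothetical, just to illustrate): the probability is a sigmoid, hence nonlinear, function of the ##\beta##s, while applying the logit recovers something exactly linear in the ##\beta##s.

```python
import numpy as np

# Hypothetical coefficients and covariate values, chosen only for illustration
beta0, beta1 = -1.0, 2.0
x = np.linspace(-3, 3, 7)

eta = beta0 + beta1 * x           # linear predictor: linear in the betas
p = 1.0 / (1.0 + np.exp(-eta))    # sigmoid: p is NOT a linear function of the betas

logit = np.log(p / (1.0 - p))     # logit (log-odds) of p
print(np.allclose(logit, eta))    # True: the logit equals the linear predictor
```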
- OLS is the "best" method (it gives the best linear unbiased estimator under the Gauss-Markov assumptions) to find the unknown coefficients when the model is linear regression (simple or multiple). OLS is also the "best" method when the model is polynomial regression, which is still linear in the coefficients (simple linear regression being its degree-1 special case); see the first sketch below this list.
- However, in the case of logistic regression, we cannot use OLS to compute the estimated coefficients. I initially wondered why, since the log of the odds is a linear function of the covariates, i.e., a straight-line model in the ##\beta##s (see the second sketch below): $$\log(\text{odds}) = \log\frac{p}{1-p} = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \dots$$
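Here is the sketch referred to in the first point: a minimal OLS fit of a degree-2 polynomial regression with NumPy on simulated data (the coefficients and sample size are made up for illustration). The model is nonlinear in ##x## but linear in the coefficients, so ordinary least squares still applies once powers of ##x## are added as extra columns of the design matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=100)
# Simulated quadratic relationship plus noise (coefficients are made up)
y = 1.0 + 0.5 * x - 0.8 * x**2 + rng.normal(scale=0.3, size=100)

# Design matrix with columns [1, x, x^2]: nonlinear in x, but linear in the betas
X = np.column_stack([np.ones_like(x), x, x**2])

# Ordinary least squares fit (np.linalg.lstsq solves the least-squares problem)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # estimates of (beta0, beta1, beta2), close to (1.0, 0.5, -0.8)
```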
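And the second sketch, for the logistic-regression point: each observed ##Y## is 0 or 1, so the empirical log-odds of an individual observation is ##\pm\infty## and cannot serve as the response in a least-squares fit; the coefficients are instead found by maximizing the likelihood, typically with Newton-Raphson (equivalently, iteratively reweighted least squares). The data and coefficient values below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])           # design matrix with an intercept
true_beta = np.array([-0.5, 1.5])              # hypothetical "true" coefficients
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p_true)                    # outcomes are 0/1, so log(y/(1-y)) is +/- infinity

# Maximum likelihood via Newton-Raphson (IRLS) instead of OLS
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    w = p * (1.0 - p)                          # working weights
    grad = X.T @ (y - p)                       # score (gradient of the log-likelihood)
    hess = X.T @ (X * w[:, None])              # Fisher information (negative Hessian)
    beta = beta + np.linalg.solve(hess, grad)

print(beta)  # should be close to true_beta
```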
Am I on the right track? Any corrections? Thank you!