Call:
glm(formula = cbind(Menarche, Total - Menarche) ~ Age, family = binomial(logit),
    data = menarche)

Deviance Residuals:
    Min       1Q   Median       3Q      Max
-2.0363  -0.9953  -0.4900   0.7780   1.3675

Coefficients:
             Estimate Std. Error z value Pr(>|z|)
(Intercept) -21.22639    0.77068  -27.54   <2e-16 ***
Age           1.63197    0.05895   27.68   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 3693.884  on 24  degrees of freedom
Residual deviance:   26.703  on 23  degrees of freedom
AIC: 114.76

Number of Fisher Scoring iterations: 4

The following requests also produce useful results: glm.out$coef, glm.out$fitted, glm.out$resid, glm.out$effects, and anova(glm.out).
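The output above comes from the menarche data shipped with the MASS package, so it can be reproduced directly; a minimal sketch (the object name glm.out follows the tutorial's convention):

```r
# Fit a binomial GLM to the menarche data from the MASS package.
# The response is a two-column matrix: successes (Menarche) and
# failures (Total - Menarche) for each age group.
library(MASS)
data(menarche)

glm.out <- glm(cbind(Menarche, Total - Menarche) ~ Age,
               family = binomial(logit), data = menarche)
summary(glm.out)

# The accessors mentioned above:
glm.out$coef     # coefficient estimates
glm.out$fitted   # fitted probabilities
glm.out$resid    # working residuals
anova(glm.out)   # analysis-of-deviance table
```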
Recall that the response variable is log odds, so the coefficient of "Age" can be interpreted as "for every one year increase in age the odds of having reached menarche increase by exp(1.632) = 5.11 times."
To evaluate the overall performance of the model, look at the null deviance and residual deviance near the bottom of the printout. Null deviance shows how well the response is predicted by a model with nothing but an intercept (grand mean). This is essentially a chi square value on 24 degrees of freedom, and indicates very little fit (a highly significant difference between fitted values and observed values). Adding in our predictors--just "Age" in this case--decreased the deviance by 3667 points on 1 degree of freedom. Again, this is interpreted as a chi square value and indicates a highly significant decrease in deviance.

The residual deviance is 26.7 on 23 degrees of freedom. We use this to test the overall fit of the model by once again treating it as a chi square value. A chi square of 26.7 on 23 degrees of freedom yields a p-value of 0.269. The null hypothesis (i.e., the model) is not rejected: the fitted values are not significantly different from the observed values.
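Both deviance tests above can be carried out with pchisq (again refitting so the snippet stands alone):

```r
library(MASS)
data(menarche)
glm.out <- glm(cbind(Menarche, Total - Menarche) ~ Age,
               family = binomial(logit), data = menarche)

# Drop in deviance from adding Age: about 3667 on 1 df,
# overwhelmingly significant.
drop <- glm.out$null.deviance - glm.out$deviance
pchisq(drop, df = 1, lower.tail = FALSE)

# Goodness-of-fit test on the residual deviance:
# 26.7 on 23 df gives p = 0.269, so the model is not rejected.
pchisq(glm.out$deviance, df = glm.out$df.residual,
       lower.tail = FALSE)
```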
Source: http://ww2.coastal.edu/kingw/statistics/R-tutorials/logistic.html
Wednesday, September 17, 2014
Interpreting Logistic Regression Results