If the number of independent variables is not very large, you can simply do “all subsets” regression, in which every possible model is fit. The model with the highest F statistic or proportion of variation explained (PVE) is then selected (note: this approach was established for linear regression but can be applied to logistic regression as well). But this often results in choosing the full model, because adding variables never decreases the explained variation. So we need to penalize models with many variables that don’t fit much better than models with fewer variables; the Akaike Information Criterion (AIC) does exactly that. Lower AIC values usually indicate a better model, so we select the model with the lowest AIC.
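The procedure above can be sketched in Python. This is a minimal illustration, not the source's own code: it fits ordinary least squares on every nonempty subset of predictors, computes the Gaussian AIC (up to an additive constant, n·ln(RSS/n) + 2k), and ranks subsets by AIC. The variable names (`x1`, `x2`, `x3`) and the synthetic data are assumptions for the example.

```python
import itertools
import numpy as np

def ols_aic(X, y):
    """Fit OLS on design matrix X (intercept included) and return
    the Gaussian AIC up to a constant: n*ln(RSS/n) + 2k."""
    n = len(y)
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1  # regression coefficients + error variance
    return n * np.log(rss / n) + 2 * k

def all_subsets_by_aic(X, y, names):
    """Fit every nonempty subset of predictors; return (AIC, subset)
    pairs sorted so the best (lowest-AIC) model comes first."""
    n, p = X.shape
    results = []
    for r in range(1, p + 1):
        for subset in itertools.combinations(range(p), r):
            # Design matrix: intercept column plus the chosen predictors.
            Xs = np.column_stack([np.ones(n), X[:, subset]])
            results.append((ols_aic(Xs, y), tuple(names[i] for i in subset)))
    return sorted(results)

# Synthetic data: y depends on x1 and x2 only; x3 is pure noise,
# so AIC's penalty should discourage including it.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

ranking = all_subsets_by_aic(X, y, ["x1", "x2", "x3"])
best_aic, best_subset = ranking[0]
print(best_subset)
```

Without the 2k penalty term, the full model would always win on raw RSS; with it, the noise variable `x3` must reduce RSS enough to justify its cost, which it typically does not.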