
Akaike information criterion


The Akaike information criterion (AIC) is a measure of the relative quality of statistical models for a given set of data. Given a collection of models for the data, AIC estimates the quality of each model, relative to each of the other models. Hence, AIC provides a means for model selection.

AIC is founded on information theory: it offers a relative estimate of the information lost when a given model is used to represent the process that generates the data. In doing so, it deals with the trade-off between the goodness of fit of the model and the complexity of the model.

AIC does not provide a test of a model in the sense of testing a null hypothesis, so it says nothing about the absolute quality of a model, only its quality relative to the other candidate models. Thus, if all the candidate models fit poorly, AIC will not give any warning of that.

Suppose that we have a statistical model of some data. Let k be the number of estimated parameters in the model. Let L̂ be the maximized value of the likelihood function for the model; i.e. L̂ = p(x | θ̂, M), where θ̂ are the parameter values that maximize the likelihood function. Then the AIC value of the model is the following:

    AIC = 2k − 2 ln(L̂)

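As a concrete illustration of how this formula is used for model selection, the following sketch (not part of the article) computes AIC for two candidate models of the same data set: a constant-mean model and a straight-line model, each with Gaussian noise. The data, variable names, and the use of NumPy are illustrative assumptions; the model with the smaller AIC value is the preferred one.

    # Minimal sketch: compute AIC = 2k - 2 ln(L-hat) for two candidate models.
    # Data, model choices, and names are illustrative assumptions, not from the article.
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative data: a noisy linear trend.
    x = np.linspace(0.0, 10.0, 50)
    y = 1.5 * x + 2.0 + rng.normal(scale=2.0, size=x.size)

    def gaussian_log_likelihood(residuals, sigma2):
        # Log-likelihood of i.i.d. Gaussian residuals with variance sigma2.
        n = residuals.size
        return -0.5 * n * np.log(2.0 * np.pi * sigma2) - np.sum(residuals**2) / (2.0 * sigma2)

    def aic(log_likelihood, k):
        # AIC = 2k - 2 ln(L-hat); k counts every estimated parameter.
        return 2.0 * k - 2.0 * log_likelihood

    # Model 1: constant mean. Estimated parameters: mean and noise variance (k = 2).
    resid1 = y - y.mean()
    sigma2_1 = np.mean(resid1**2)          # maximum-likelihood variance estimate
    aic1 = aic(gaussian_log_likelihood(resid1, sigma2_1), k=2)

    # Model 2: straight line. Estimated parameters: slope, intercept, variance (k = 3).
    slope, intercept = np.polyfit(x, y, deg=1)
    resid2 = y - (slope * x + intercept)
    sigma2_2 = np.mean(resid2**2)
    aic2 = aic(gaussian_log_likelihood(resid2, sigma2_2), k=3)

    print(f"AIC (constant mean): {aic1:.2f}")
    print(f"AIC (straight line): {aic2:.2f}")

Note that k counts all estimated parameters, including the noise variance; with that convention, the straight-line model is preferred here only if its improved fit outweighs the penalty for the extra parameter.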
...

...