Akaike information criterion

Summary

The Akaike Information Criterion (AIC) is a frequentist model selection criterion that scores each candidate model by its maximized likelihood, penalized by the number of estimated parameters. The AIC estimates only the relative quality of the candidate models (so it is the AIC differences between models that should be reported), and it will not indicate whether all of the candidate models describe the data poorly. Each model's score is a relative estimate of the KL divergence between that model and the true data-generating distribution.
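
As a concrete illustration of the definition and of reporting AIC differences, here is a minimal Python sketch. The candidate models, their parameter counts k, and their maximized log-likelihoods are hypothetical values chosen only for illustration; the snippet computes AIC = 2k - 2 ln L̂ for each model and the differences relative to the best model:

    # A minimal sketch of computing and comparing AIC scores. The model names,
    # parameter counts k, and maximized log-likelihoods below are hypothetical.
    candidates = [
        ("linear",    2, -134.7),   # (name, k, maximized log-likelihood)
        ("quadratic", 3, -131.2),
        ("cubic",     4, -130.9),
    ]

    # AIC = 2k - 2 ln L-hat; lower values are better.
    aic = {name: 2 * k - 2 * loglik for name, k, loglik in candidates}

    # Only differences in AIC are meaningful, so each model is reported
    # relative to the minimum-AIC model (delta = 0 for the best model).
    best = min(aic.values())
    for name, score in aic.items():
        print(f"{name}: AIC = {score:.1f}, delta = {score - best:.1f}")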

Context

This concept has the prerequisites:

Goals

  • Know the definition of the AIC.
  • Understand how the AIC is justified in terms of expected performance on held-out data (see the sketch after this list).
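
One standard way to sketch that justification (a textbook asymptotic argument, not taken from the resources below): under regularity conditions, the maximized in-sample log-likelihood overestimates the expected log-likelihood on an independent replicate data set ỹ by approximately k, the number of estimated parameters, so

    \mathbb{E}\!\left[\log L(\hat{\theta}; \tilde{y})\right] \;\approx\; \log L(\hat{\theta}; y) - k
    \quad\Longrightarrow\quad
    \mathrm{AIC} \;=\; -2\log L(\hat{\theta}; y) + 2k \;\approx\; -2\,\mathbb{E}\!\left[\log L(\hat{\theta}; \tilde{y})\right].

Choosing the model with the smallest AIC therefore approximately maximizes the expected held-out log-likelihood; since the entropy of the true distribution is the same constant for every candidate model, this is equivalent to minimizing the KL divergence mentioned in the summary, up to that shared constant.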

Core resources (we're sorry, we haven't finished tracking down resources for this concept yet)

Supplemental resources (the following are optional, but you may find them useful)

-Free-

Model Selection and Model Averaging in Phylogenetics: Advantages of Akaike Information Criterion and Bayesian Approaches Over Likelihood Ratio Tests (2004)
Location: Akaike Information Criterion (bottom of 6th page)
Authors: David Posada, Thomas Buckley

See also