Unifying the Derivations of Kullback Information Criterion and Corrected Versions

Authors

  • Warangkhana Keerativibool, Department of Mathematics and Statistics, Faculty of Science, Thaksin University, Phatthalung 93110, Thailand.

Keywords:

KIC, KICc, Kullback’s directed divergence, Kullback’s symmetric divergence, model selection

Abstract

The Kullback information criterion (KIC) was proposed by Cavanaugh (1999) as an asymptotically unbiased estimator of a variant of Kullback's symmetric divergence between the true model and a fitted candidate model. It is arguably more sensitive than criteria based on the directed divergence. However, for a small sample size, or when the dimension of the candidate model is large relative to the sample size, KIC exhibits a large negative bias. Several authors, namely Cavanaugh (2004), Seghouane and Bekara (2004), and Hafidi and Mkhadri (2006), proposed criteria to correct this bias; these corrected versions of KIC are called, respectively, KICcC, KICcSB, and KICcHM in this paper. Because they have multiple formulas, the aims of this paper are to unify these criteria and to examine their performance relative to the AIC family of criteria, using both theoretical analysis and an extensive simulation study. The unified derivations of the criteria based on Kullback's symmetric divergence show that KICcC is closest to the expected estimated symmetric discrepancy and has the strongest penalty function under the condition (1 - p/n)exp(p - n) < 1, followed, respectively, by KICcSB, KICcHM, and KIC. As a result, KICcC attains the highest efficiency, even though it is more likely to select an underfitted model.
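For orientation, the two baseline criteria contrasted in the abstract differ only in their penalty terms: AIC = -2 log L + 2k estimates Kullback's directed divergence, while KIC = -2 log L + 3k (Cavanaugh, 1999) targets the symmetric divergence. The sketch below computes both for a Gaussian linear model; the simulated data and helper names are illustrative only and do not reproduce the paper's simulation design or its corrected criteria.

```python
import numpy as np

def gaussian_loglik(y, y_hat):
    """Maximized log-likelihood of a Gaussian linear model (sigma^2 = RSS/n)."""
    n = len(y)
    sigma2 = np.sum((y - y_hat) ** 2) / n
    return -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)

def aic(loglik, k):
    """AIC: based on Kullback's directed divergence; penalty 2k."""
    return -2.0 * loglik + 2.0 * k

def kic(loglik, k):
    """KIC (Cavanaugh, 1999): based on the symmetric divergence; penalty 3k."""
    return -2.0 * loglik + 3.0 * k

# Illustrative comparison on simulated data (candidate model with k = 3 terms).
rng = np.random.default_rng(0)
n, k = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 2.0, 0.0]) + rng.normal(size=n)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
ll = gaussian_loglik(y, X @ beta_hat)
print(f"AIC = {aic(ll, k):.3f}, KIC = {kic(ll, k):.3f}")
```

Because KIC adds k to the AIC penalty, it favors more parsimonious models; the corrected versions studied in the paper sharpen this penalty further in small samples.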

How to Cite

Keerativibool, W. (2015). Unifying the Derivations of Kullback Information Criterion and Corrected Versions. Thailand Statistician, 12(1), 37–53. Retrieved from https://ph02.tci-thaijo.org/index.php/thaistat/article/view/34203

Section

Articles