by Ove Frank², M. L. Menéndez³ and L. Pardo⁴
Research Report 1997:3
²Department of Statistics, Stockholm University, S-106 91 Stockholm
³Department of Applied Mathematics, Technical University of Madrid, 28040 Madrid
⁴Department of Statistics & O.R., Complutense University of Madrid, 28040 Madrid
A divergence measure between discrete probability distributions introduced by Csiszár (1967) generalizes the Kullback-Leibler information and several other information measures considered in the literature. We introduce a weighted divergence which generalizes the weighted Kullback-Leibler information considered by Taneja (1985). The weighted divergence between an empirical distribution and a fixed distribution, and the weighted divergence between two independent empirical distributions, are investigated here for large simple random samples. The asymptotic distributions are shown to be either normal or equal to the distribution of a linear combination of independent chi-square variables.
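As a rough numerical illustration of the quantities discussed above, the sketch below computes a weighted Csiszár-type divergence of the form Σᵢ wᵢ qᵢ f(pᵢ/qᵢ), with f(t) = t log t recovering a weighted Kullback-Leibler information. This functional form, the function and variable names, and the example distributions are assumptions for illustration only; the precise definitions follow the paper itself.

```python
import math

def weighted_csiszar_divergence(p, q, w, f):
    """Weighted Csiszar-type f-divergence: sum_i w_i * q_i * f(p_i / q_i).

    Illustrative form only (an assumption); the paper's exact weighted
    divergence may differ in how the weights enter.
    """
    return sum(wi * qi * f(pi / qi) for pi, qi, wi in zip(p, q, w))

# f(t) = t * log(t) yields a (weighted) Kullback-Leibler information.
def kl_generator(t):
    return t * math.log(t)

p = [0.2, 0.5, 0.3]    # e.g. an empirical distribution
q = [0.25, 0.25, 0.5]  # a fixed reference distribution
w = [1.0, 1.0, 1.0]    # unit weights reduce to the unweighted divergence

d = weighted_csiszar_divergence(p, q, w, kl_generator)
```

With unit weights and f(t) = t log t, the expression collapses to the ordinary Kullback-Leibler information Σᵢ pᵢ log(pᵢ/qᵢ), which is a quick sanity check on the sketch.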
Key words: information, entropy, divergence, goodness-of-fit, asymptotic sampling distributions.