**by Ove Frank^{2}, M. L. Menéndez^{3} and L. Pardo^{4}**

**Research Report 1997:3**

^{2}Department of Statistics, Stockholm University, S-106 91 Stockholm
^{3}Department of Applied Mathematics, Technical University of Madrid, 28040 Madrid
^{4}Department of Statistics & O.R., Complutense University of Madrid, 28040 Madrid

**Abstract**

A divergence measure between discrete probability distributions introduced by Csiszár (1967) generalizes the Kullback-Leibler information and several other information measures considered in the literature. We introduce a weighted divergence which generalizes the weighted Kullback-Leibler information considered by Taneja (1985). The weighted divergence between an empirical distribution and a fixed distribution, and the weighted divergence between two independent empirical distributions, are investigated here for large simple random samples; the asymptotic distributions are shown to be either normal or equal to the distribution of a linear combination of independent chi-square variables.
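As an illustration of the quantities the abstract refers to, the following sketch computes a weighted Csiszár-type divergence between two discrete distributions. The functional form used here, a weighted sum of terms q_i·φ(p_i/q_i), is an assumption for illustration only; the paper's exact definition of the weighted divergence may differ. The function names and the weight convention are likewise hypothetical.

```python
import math

def weighted_csiszar_divergence(p, q, w, phi):
    """Sketch of a weighted Csiszar phi-divergence:
    sum_i w_i * q_i * phi(p_i / q_i).
    (Assumed form; not necessarily the paper's exact definition.)
    p, q: discrete probability distributions (same support, q_i > 0).
    w: nonnegative weights, phi: convex function with phi(1) = 0.
    """
    return sum(wi * qi * phi(pi / qi) for pi, qi, wi in zip(p, q, w))

def weighted_kl(p, q, w):
    """Choosing phi(t) = t*log(t) recovers a weighted
    Kullback-Leibler information, in the spirit of Taneja (1985)."""
    return weighted_csiszar_divergence(p, q, w, lambda t: t * math.log(t))

# With unit weights this reduces to the ordinary Kullback-Leibler
# information, which vanishes when p == q and is positive otherwise.
print(weighted_kl([0.5, 0.5], [0.5, 0.5], [1.0, 1.0]))  # 0.0
print(weighted_kl([0.4, 0.6], [0.5, 0.5], [1.0, 1.0]))  # > 0
```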

**Key words:** information, entropy, divergence, goodness-of-fit, asymptotic sampling distributions.

*Last update: 1997-12-16 / KH*