Institute of Information Theory and Automation

Bibliography

Journal Article

Use of Kullback–Leibler divergence for forgetting

Kárný Miroslav, Andrýsek Josef

: Journal: International Journal of Adaptive Control and Signal Processing, vol. 23, no. 1 (2009), pp. 1-15

: Research plan: CEZ:AV0Z10750506

: Grants: 2C06001 (GA MŠk), 1M0572 (GA MŠk), GA102/08/0567 (GA ČR)

: Keywords: Bayesian estimation, Kullback–Leibler divergence, functional approximation of estimation, parameter tracking by stabilized forgetting, ARX model

: DOI: 10.1002/acs.1080

: Download: http://library.utia.cas.cz/separaty/2008/AS/karny-use%20of%20kullback-leibler%20divergence%20for%20forgetting.pdf

(eng): The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs, and his methodological result also implies the proper order of the KLD arguments. Functional approximation of estimation and stabilized forgetting, which serve for tracking slowly varying parameters, use the reversed order. This choice has a pragmatic motivation: a recursive estimator often approximates the parametric model by a member of the exponential family (EF), since such a model maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF. Approximation based on the KLD with the reversed order of arguments preserves this property. The paper advocates approximation performed within the CEF but with the proper order of the KLD arguments. It is applied to parameter tracking, and performance improvements are demonstrated.
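To make the distinction concrete, the sketch below (not part of the record and not the authors' implementation) contrasts the two forgetting operators induced by the two argument orders of the KLD, specialised to scalar Gaussian pdfs under the usual stabilized-forgetting setup; the names Gauss, post, alt and lam are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, assuming scalar Gaussian pdfs: the two forgetting operators
# that arise from the two argument orders of the Kullback-Leibler divergence.
# Not the authors' code; names and parameter values are illustrative only.

from dataclasses import dataclass


@dataclass
class Gauss:
    mean: float
    var: float


def geometric_forgetting(post: Gauss, alt: Gauss, lam: float) -> Gauss:
    """Reversed KLD order, argmin_f lam*D(f||post) + (1-lam)*D(f||alt):
    the normalized geometric mean post^lam * alt^(1-lam), which for
    Gaussians combines the precisions (inverse variances)."""
    prec = lam / post.var + (1.0 - lam) / alt.var
    mean = (lam * post.mean / post.var + (1.0 - lam) * alt.mean / alt.var) / prec
    return Gauss(mean, 1.0 / prec)


def moment_matched_forgetting(post: Gauss, alt: Gauss, lam: float) -> Gauss:
    """Proper (Bernardo) KLD order, argmin_f lam*D(post||f) + (1-lam)*D(alt||f)
    restricted to the Gaussian family: moment matching of the arithmetic
    mixture lam*post + (1-lam)*alt."""
    mean = lam * post.mean + (1.0 - lam) * alt.mean
    second = lam * (post.var + post.mean ** 2) + (1.0 - lam) * (alt.var + alt.mean ** 2)
    return Gauss(mean, second - mean ** 2)


if __name__ == "__main__":
    post = Gauss(mean=2.0, var=0.5)  # current posterior pdf
    alt = Gauss(mean=0.0, var=4.0)   # flat-ish alternative pdf
    lam = 0.9                        # forgetting factor
    print(geometric_forgetting(post, alt, lam))
    print(moment_matched_forgetting(post, alt, lam))
```

Both operators flatten the posterior toward the alternative pdf; they differ in which KLD argument order is minimised, which is exactly the choice the abstract discusses.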

(cze, translated): The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability densities. One of its two versions can be shown to be theoretically preferable. The paper describes how this fact can be exploited to improve the forgetting technique.

: BB

2019-01-07 08:39