Bibliography
Journal Article
Optimal design of priors constrained by external predictors
Journal: International Journal of Approximate Reasoning, vol. 84, no. 1 (2017), pp. 150–158
Grant: GA16-09848S, GA ČR (Czech Science Foundation)
Keywords: Fully probabilistic design, Parameter prior, External predictive distribution, Bayesian transfer learning, Kullback–Leibler divergence
URL: http://library.utia.cas.cz/separaty/2017/AS/guy-0473911.pdf
Abstract (eng): This paper exploits knowledge made available by an external source, in the form of a predictive distribution, to elicit a parameter prior. It adopts the terminology of Bayesian transfer learning, one of many domains dealing with reasoning as coherent knowledge processing. An empirical solution to this problem was provided in [19], based on interpreting the external predictor as an empirical distribution constructed from fictitious data. This paper makes two main contributions. First, the problem is solved using formal hierarchical Bayesian modeling [25], and the knowledge transfer is achieved optimally, i.e. in the minimum Kullback–Leibler divergence (KLD) sense. Second, the hierarchical setting yields a distribution on the set of possible priors, with the choice from [19] acting as the base distribution. This allows randomized choices of the prior to be generated, avoiding costly and/or intractable estimation of this prior. It also provides measures of uncertainty in the prior choice, so that subsequent learning tasks can be assessed for robustness to that choice. Instantiations of the method in previously published applications (knowledge elicitation, recursive learning, and flat cooperation of adaptive controllers) are recalled, and prospective application domains are mentioned.
RIV field: BC
FORD: 10201
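
Method sketch: The abstract's minimum-KLD transfer can be read as a constrained optimization: among all priors whose induced predictor reproduces the external predictive distribution, choose the one closest in Kullback–Leibler divergence to a nominal prior. The notation below (pi for the candidate prior, pi_0 for the nominal prior, f_E for the external predictor) is illustrative, not taken from the paper:

\[
\pi^{o} = \arg\min_{\pi \in \mathcal{F}} \int \pi(\theta)\,\ln\frac{\pi(\theta)}{\pi_{0}(\theta)}\,\mathrm{d}\theta,
\qquad
\mathcal{F} = \Bigl\{ \pi : \int f(y \mid \theta)\, \pi(\theta)\, \mathrm{d}\theta = f_{E}(y) \ \text{for all } y \Bigr\}.
\]

A minimal numerical sketch of this formulation on a discretized Binomial model follows. The grid, the synthetic Beta(5, 2) source used to generate f_E, and the uniform nominal prior are assumptions made purely for illustration; the paper's hierarchical construction and randomized prior draws are not reproduced here.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom, beta

# Discretized parameter grid for a Binomial(n, theta) observation model.
n = 10
thetas = np.linspace(0.01, 0.99, 41)
ys = np.arange(n + 1)

# Likelihood matrix L[y, j] = f(y | theta_j).
L = binom.pmf(ys[:, None], n, thetas[None, :])

# External predictive distribution f_E(y), synthesized here from a hidden
# Beta(5, 2) prior purely so that the constraint set is non-empty.
hidden = beta.pdf(thetas, 5, 2)
hidden /= hidden.sum()
f_E = L @ hidden

# Nominal prior pi_0: uniform on the grid (an illustrative choice).
pi0 = np.full_like(thetas, 1.0 / len(thetas))

def kld(pi):
    # KL(pi || pi0) on the grid, guarded against log(0).
    pi = np.clip(pi, 1e-12, None)
    return float(np.sum(pi * np.log(pi / pi0)))

constraints = [
    {"type": "eq", "fun": lambda pi: L @ pi - f_E},    # match external predictor
    {"type": "eq", "fun": lambda pi: pi.sum() - 1.0},  # normalization
]
res = minimize(kld, pi0, bounds=[(0.0, 1.0)] * len(thetas),
               constraints=constraints, method="SLSQP")
print("converged:", res.success, " KL(pi||pi0) at optimum:", kld(res.x))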