# Bibliography

**Type: **Abstract

### Information-theoretic approach to combining expert opinions in probabilistic form

**Published in: **Abstracts of the 6th Ritsumeikan-Monash Symposium on Probability and Related Fields

**Event: **6th Ritsumeikan-Monash Symposium on Probability and Related Fields (Biwako-Kusatsu Campus, Ritsumeikan University, Shiga, JP, 2016-11-11)

**Grant: **GA16-09848S, GA AV ČR

**Keywords: **combining expert opinions, minimum cross-entropy principle

**URL: **http://library.utia.cas.cz/separaty/2016/AS/seckarova-0467172.pdf

**Abstract: **The aggregation of experts' opinions, expressed as probabilities assigned to possible events, is of great importance in many branches of decision making, economics, and the social sciences. We propose a systematic way to combine discrete probability distributions based on Bayesian decision theory and information theory, namely the cross-entropy (also known as the Kullback-Leibler (KL) divergence). The optimal combination is the probability mass function minimizing the conditional expected KL-divergence. The expectation is taken with respect to a probability density function (pdf) that also minimizes the KL-divergence under problem-reflecting constraints. When this pdf is a Dirichlet distribution, the resulting combination is linear, with weights related to the above-mentioned constraints. We then compare this combination with other KL-divergence-based combinations: linear (lin) and logarithmic (log) pooling with equal weights. When an event assigned higher probability occurs, the proposed combination performs similarly to the lin combination and outperforms the log combination. When a low-probability event occurs, the proposed combination outperforms both the lin and log combinations. Thus, the proposed combination improves decision making in areas such as crowd modelling (pedestrian movement) and betting (predictions of football game results).
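The lin and log pooling baselines mentioned in the abstract can be sketched as follows. This is an illustrative sketch, not the paper's own method: the function names are made up, equal weights are hard-coded, and the paper's optimal weights (derived via the Dirichlet pdf) are not reproduced here.

```python
import numpy as np

def kl_divergence(p, q):
    # KL divergence D(p || q) for discrete distributions (natural log);
    # zero-probability outcomes in p contribute nothing by convention.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def linear_pool(opinions, weights):
    # lin combination: weighted arithmetic mean of the experts' pmfs
    return np.average(opinions, axis=0, weights=weights)

def log_pool(opinions, weights):
    # log combination: weighted geometric mean, renormalized to sum to 1
    g = np.exp(np.average(np.log(opinions), axis=0, weights=weights))
    return g / g.sum()

# Two hypothetical experts' pmfs over three events, equal weights
experts = np.array([[0.6, 0.3, 0.1],
                    [0.4, 0.4, 0.2]])
w = [0.5, 0.5]

lin = linear_pool(experts, w)   # [0.5, 0.35, 0.15]
log_ = log_pool(experts, w)
```

Both pools return valid pmfs; `kl_divergence` can then score each pool against the distribution degenerate at the event that actually occurred, which is the kind of comparison the abstract reports.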