Institute of Information Theory and Automation

Bibliography

Journal Article

Optimality conditions for maximizers of the information divergence from an exponential family

Matúš, František

Journal: Kybernetika, vol. 43, no. 5 (2007), pp. 731-746

Research plan: CEZ:AV0Z10750506

Grant: IAA100750603, GA AV ČR

Keywords: Kullback-Leibler divergence, relative entropy, exponential family, information projection, log-Laplace transform, cumulant generating function, directional derivatives, convex functions, first-order optimality conditions, polytopes

(eng): The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q over Q in E. All directional derivatives of the divergence from E are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from E are presented, including new ones for the case when P is not projectable to E.
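The infimum in the abstract, inf over Q in E of D(P||Q), can be illustrated numerically. The sketch below is not the paper's method; it is a minimal illustration, assuming a finite set with sufficient statistics F, that runs plain gradient descent on the convex function Λ(θ) − θ·E_P[f] (whose stationary point is the moment-matching information projection) and returns the resulting divergence value. The function name and example data are invented for the illustration.

```python
import numpy as np

def divergence_from_family(P, F, iters=5000, lr=0.5):
    """Approximate inf_theta D(P || Q_theta), where Q_theta(x) is
    proportional to exp(theta . F[x]) on a finite set.

    Uses the identity D(P || Q_theta) = -H(P) - theta . E_P[F] + Lambda(theta),
    with Lambda the log-Laplace (cumulant generating) transform, and
    minimizes the convex part by gradient descent (illustrative only)."""
    P = np.asarray(P, dtype=float)
    F = np.asarray(F, dtype=float)        # shape (n_points, d): sufficient statistics
    mean_P = P @ F                        # moments of P under the statistics F
    theta = np.zeros(F.shape[1])
    for _ in range(iters):
        logits = F @ theta
        logits -= logits.max()            # shift for numerical stability
        Q = np.exp(logits)
        Q /= Q.sum()                      # current member Q_theta of the family
        theta -= lr * (F.T @ Q - mean_P)  # gradient: E_Q[F] - E_P[F]
    logits = F @ theta
    shift = logits.max()
    Lambda = np.log(np.exp(logits - shift).sum()) + shift
    neg_entropy = np.sum(P[P > 0] * np.log(P[P > 0]))
    return neg_entropy - theta @ mean_P + Lambda
```

For example, for P perfectly correlating two binary coordinates and E the independence (product) family on a 2x2 table, the projection is the uniform distribution and the divergence is log 2.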

(cze, translated): The information divergence of a probability measure P from an exponential family E is defined as the infimum of the information divergences of P from Q over Q in E. For discrete E, all directional derivatives of the information divergence from E were found explicitly. To this end, the behaviour of the conjugate of the log-Laplace transform was studied. All first-order optimality conditions were found.

Field: BA
