Bibliography
Conference Paper (international conference)
Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition
In: Proceedings of the Sixth European Workshop on Probabilistic Graphical Models, pp. 355-362
Event: The Sixth European Workshop on Probabilistic Graphical Models (Granada, ES, 19.09.2012-21.09.2012)
Grants: GA102/09/1278 (GA ČR), GA201/08/0539 (GA ČR)
Keywords: probabilistic graphical models, probabilistic inference, CP tensor decomposition
Abstract (eng): Conditional probability tables (CPTs) of threshold functions generalize two popular models, noisy-or and noisy-and, and offer an alternative when these two models are too coarse. With standard inference techniques, the inference complexity is exponential in the number of parents of a variable. When the CPTs take a special form (in this paper, the noisy-threshold model), more efficient inference techniques can be employed. Each CPT defined over variables with a finite number of states can be viewed as a tensor (a multidimensional array). Tensors can be decomposed into linear combinations of rank-one tensors, where a rank-one tensor is an outer product of vectors. Such a decomposition is referred to as the Canonical Polyadic (CP) or CANDECOMP/PARAFAC decomposition. This tensor decomposition offers a compact representation of CPTs that can be exploited for efficient probabilistic inference.
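The compactness argument in the abstract can be illustrated with a minimal sketch (not the authors' implementation; the function name and the random factors are hypothetical): a rank-R CP representation of a tensor over n+1 binary variables stores R·(n+1) vectors of length 2, instead of all 2^(n+1) entries of the full table, and the full tensor is recovered as a sum of outer products.

```python
import numpy as np

def cp_tensor(factors):
    """Reconstruct a tensor from its CP decomposition.

    factors: a list of R terms; each term is a list of vectors,
    one vector per tensor mode. The result is the sum of the
    R rank-one tensors formed as outer products of these vectors.
    """
    shape = tuple(len(v) for v in factors[0])
    T = np.zeros(shape)
    for term in factors:
        rank_one = term[0]
        for v in term[1:]:
            # Chain outer products to build one rank-one tensor.
            rank_one = np.multiply.outer(rank_one, v)
        T += rank_one
    return T

# Hypothetical example: a rank-2 CP representation of a 3-way
# tensor over binary variables (2 x 2 x 2 = 8 entries), stored
# as 2 terms x 3 vectors x 2 numbers = 12 numbers. For larger
# numbers of parents the full table grows exponentially while
# the CP representation grows linearly (at fixed rank).
rng = np.random.default_rng(0)
factors = [[rng.random(2) for _ in range(3)] for _ in range(2)]
T = cp_tensor(factors)
print(T.shape)  # (2, 2, 2)
```

The savings become significant precisely in the setting the paper targets: a CPT with many parent variables whose special structure (here, the noisy-threshold model) admits a low-rank CP decomposition.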
: JD