In the propositional setting, the marginal problem is to find a (maximum-entropy) distribution that has some given marginals. We study this problem in a relational setting and make the following contributions. First, we compare two different notions of relational marginals. Second, we show a duality between the resulting relational marginal problems and the maximum likelihood estimation of the parameters of relational models, which generalizes a well-known duality from the propositional setting. Third, by exploiting the relational marginal formulation, we present a statistically sound method to learn the parameters of relational models that is applicable in settings where the number of constants differs between the training and test data. Furthermore, based on a relational generalization of marginal polytopes, we characterize cases where the standard estimators based on a feature's number of true groundings need to be adjusted, and we quantitatively characterize the consequences of these adjustments. Fourth, we prove bounds on the expected errors of the estimated parameters, which allows us to lower-bound, among other things, the effective sample size of relational training data.
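As an illustrative sketch of the propositional marginal problem mentioned above (not the paper's relational method), the snippet below finds the maximum-entropy joint distribution over two binary variables with prescribed single-variable marginals, using iterative proportional fitting; the function name and the choice of IPF are this sketch's own.

```python
import numpy as np

def max_entropy_joint(px1, py1, iters=50):
    """Max-entropy 2x2 joint with P(X=1)=px1, P(Y=1)=py1 via IPF.

    Illustrative sketch: with only univariate marginal constraints,
    the maximum-entropy joint is the product of the marginals.
    """
    row_marg = np.array([1.0 - px1, px1])  # target distribution of X
    col_marg = np.array([1.0 - py1, py1])  # target distribution of Y
    p = np.full((2, 2), 0.25)              # start from the uniform joint
    for _ in range(iters):
        p *= (row_marg / p.sum(axis=1))[:, None]  # rescale to match X-marginal
        p *= (col_marg / p.sum(axis=0))[None, :]  # rescale to match Y-marginal
    return p

p = max_entropy_joint(0.7, 0.4)
# With only univariate constraints the solution factorizes: p(x,y) = p(x)p(y).
print(np.allclose(p, np.outer([0.3, 0.7], [0.6, 0.4])))  # True
```

With overlapping higher-order marginals (e.g. pairwise constraints over three variables), the same IPF loop no longer reduces to a simple product, which is where the maximum-entropy formulation, and its relational generalization studied in this paper, becomes nontrivial.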
Relational Marginal Problems: Theory and Estimation
Ondřej Kuželka, Yuyi Wang, Jesse Davis, S. Schockaert
Published 2017 in AAAI Conference on Artificial Intelligence
PUBLICATION RECORD
- Publication year: 2017
- Venue: AAAI Conference on Artificial Intelligence
- Publication date: 2017-09-18
- Fields of study: Mathematics, Computer Science
- Source metadata: Semantic Scholar
REFERENCES
- 17 references
CITED BY
- 25 citing papers