Bayesian Inference Reuse in Multilevel Models

David Tolpin

Published 2026 in ACM Transactions on Probabilistic Machine Learning

ABSTRACT

We address efficient Bayesian inference in multilevel models, where group-specific latent variables are drawn from a shared hyperprior. In standard approaches, inferring the posterior for a new group requires revisiting all previous groups, incurring growing computational cost due to increased data volume and latent dimensionality. We propose replacing past groups with a set of weighted virtual observations of latent variables that preserve the prior over new groups, enabling fast, scalable inference. We provide theoretical analysis, empirical validation on case studies, and a reference implementation compatible with common probabilistic programming languages and inference algorithms.
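The abstract's core idea — summarizing previously inferred groups as weighted virtual observations of the latent variables, so a new group can be fit without revisiting old data — can be illustrated in a conjugate Normal–Normal hierarchy. The sketch below is illustrative only and is not the paper's method or implementation; all names, the known-variance assumption, and the conjugate updates are simplifying assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed conjugate hierarchy (known variances for the sketch):
#   mu ~ N(mu0, s0^2)            shared hyper-mean
#   theta_j | mu ~ N(mu, tau^2)  group-specific latents
#   y_ji | theta_j ~ N(theta_j, sigma^2)
tau, sigma = 1.0, 0.5
mu0, s0 = 0.0, 2.0

# Simulate J past groups with n observations each
J, n = 20, 30
mu_true = 1.5
theta = rng.normal(mu_true, tau, size=J)
y = rng.normal(theta[:, None], sigma, size=(J, n))

# Step 1: compress each past group into one weighted "virtual observation"
# of its latent theta_j (here, the group sample mean), discarding raw data.
theta_hat = y.mean(axis=1)            # virtual observation values
w = 1.0 / (sigma**2 / n + tau**2)     # precision of each virtual obs w.r.t. mu

# Step 2: conjugate posterior for mu from the weighted virtual observations
prec = 1.0 / s0**2 + J * w
mu_post = (mu0 / s0**2 + w * theta_hat.sum()) / prec
s_post = np.sqrt(1.0 / prec)

# Step 3: effective prior over a NEW group's latent, with mu integrated out:
#   theta_new ~ N(mu_post, s_post^2 + tau^2)
# Inference for the new group now needs only these summaries, not past data.
prior_mean, prior_var = mu_post, s_post**2 + tau**2
print(prior_mean, prior_var)
```

In this conjugate special case the compression is lossless for the hyper-mean; the paper's contribution is the general construction of such prior-preserving virtual observations for use with arbitrary probabilistic programs and inference algorithms.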

PUBLICATION RECORD

  • Publication year

    2026

  • Venue

    ACM Transactions on Probabilistic Machine Learning

  • Publication date

    2026-02-28

  • Source metadata

    Semantic Scholar
