Description

Consider a Bayesian model where the likelihood is a binomial distribution with probability parameter $p$. We over-parameterize the model by writing $p = p_1 p_2$, and assume that each $p_i$ has a uniform prior on the interval $[0, 1]$.

In summary:

\[\begin{aligned} p_1 &\sim \text{Unif}(0, 1) \\ p_2 &\sim \text{Unif}(0, 1) \\ y &\sim \text{Bin}(n, p_1 p_2) \end{aligned}\]
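As a sanity check on this parameterization, the induced prior on the product $p = p_1 p_2$ is available in closed form (a standard fact about products of independent uniforms):

\[F(z) = \Pr(p_1 p_2 \le z) = \int_0^1 \min\!\left(1, \frac{z}{u}\right) \mathrm{d}u = z - z \log z, \qquad f(z) = F'(z) = -\log z, \quad z \in (0, 1],\]

so the prior on $p$ places more mass near $0$ than a flat prior on a single probability parameter would.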

Here we use the values:

| Parameter | Value |
| --- | --- |
| Number of trials $n$ | 100000 |
| Number of successes $y$ | 50000 |

This is a toy example of an unidentifiable parameterization. In practice many popular Bayesian models are unidentifiable.
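To see the unidentifiability concretely, here is a minimal sketch (plain Python, independent of the report's pipeline) checking that the binomial log-likelihood depends on $(p_1, p_2)$ only through the product $p_1 p_2$:

```python
import math

def binom_loglik(y, n, p):
    """Binomial log-likelihood of y successes in n trials with success probability p."""
    return (math.lgamma(n + 1) - math.lgamma(y + 1) - math.lgamma(n - y + 1)
            + y * math.log(p) + (n - y) * math.log(1 - p))

n, y = 100_000, 50_000
# Two distinct pairs (p1, p2) with the same product p1 * p2 = 0.5
ll_a = binom_loglik(y, n, 0.8 * 0.625)
ll_b = binom_loglik(y, n, 1.0 * 0.5)
print(ll_a == ll_b)  # True: the data cannot distinguish the two pairs
```

Any curve of constant product in the unit square therefore receives the same likelihood, which is why the posterior below ridges along $p_1 p_2 = y/n$.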

When there are many observations, the posterior of an unidentifiable model concentrates on a sub-manifold (here, the curve $p_1 p_2 = y/n$), making sampling difficult, as shown in the following pair plots.

Pair plot

Diagonal entries show estimates of the marginal densities as well as the (0.16, 0.5, 0.84) quantiles (dotted lines). Off-diagonal entries show estimates of the pairwise densities.

The movie linked below (🍿) superimposes 100 MCMC iterations.

🔍 Full page 🍿 Movie 🔗 Info

Trace plots

🔍 Full page

Moments

| parameters | mean | std | mcse | ess_bulk | ess_tail | rhat | ess_per_sec |
| --- | --- | --- | --- | --- | --- | --- | --- |
| p1 | 0.700242 | 0.138025 | 0.0125449 | 122.567 | 159.23 | 1.007 | missing |
| p2 | 0.741892 | 0.142979 | 0.0126691 | 122.113 | 170.53 | 1.01075 | missing |
💾 CSV

Cumulative traces

For each iteration $i$, the plot shows the running average up to $i$: $\frac{1}{i} \sum_{n = 1}^{i} x_n$.

🔍 Full page

Local communication barrier

When the global communication barrier is large, many chains may be required to obtain tempered restarts.

The local communication barrier can be used to visualize the cause of a high global communication barrier. For example, if there is a sharp peak close to a reference constructed from the prior, it may be useful to switch to a variational approximation.

🔍 Full page 🔗 Info

GCB estimation progress

Estimate of the Global Communication Barrier (GCB) as a function of the adaptation round.

The global communication barrier can be used to set the number of chains. The theoretical framework of Syed et al., 2021 shows that, under simplifying assumptions, it is optimal to set the number of chains (the argument n_chains in pigeons()) to roughly 2Λ.

Last round estimate: $3.5523351664264387$
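Applying the 2Λ rule of thumb to the last-round estimate above (a quick sketch, not part of Pigeons):

```python
import math

# Last-round GCB estimate reported above
gcb = 3.5523351664264387

# Syed et al., 2021 rule of thumb: n_chains ≈ 2Λ, rounded up
suggested = math.ceil(2 * gcb)
print(suggested)  # 8
```

So roughly 7 to 8 chains would suffice here; the run reported below used n_chains = 10.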

🔍 Full page 🔗 Info

Evidence estimation progress

Estimate of the log normalization constant (computed using the stepping stone estimator) as a function of the adaptation round.
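To illustrate the idea behind the stepping stone estimator, here is a self-contained toy sketch (plain Python, independent of Pigeons' implementation) on a one-dimensional example where the true normalizing constant is known: the unnormalized target $q(x) = e^{-x^2/2}$ has $Z = \sqrt{2\pi}$, and each intermediate distribution along the path is Gaussian, so it can be sampled exactly.

```python
import math
import random

def stepping_stone_demo(n_rungs=10, n_samples=20_000, seed=1):
    """Toy stepping stone estimate of log Z for q(x) = exp(-x^2/2),
    whose true normalizer is Z = sqrt(2*pi).

    Path: q_beta(x) = p0(x)^(1-beta) * q(x)^beta, where the reference
    p0 is a normalized N(0, sigma0^2) density. Every intermediate
    distribution is Gaussian, so we can draw iid samples from it.
    """
    rng = random.Random(seed)
    sigma0 = 2.0
    log_p0_const = -math.log(sigma0 * math.sqrt(2 * math.pi))

    def log_ratio(x):
        # log q(x) - log p0(x): the direction along which we anneal
        return -x**2 / 2 - (-x**2 / (2 * sigma0**2) + log_p0_const)

    betas = [k / n_rungs for k in range(n_rungs + 1)]
    log_z = 0.0
    for k in range(n_rungs):
        b, b_next = betas[k], betas[k + 1]
        # q_b is proportional to a centered Gaussian with this variance
        var_b = 1.0 / (b + (1 - b) / sigma0**2)
        samples = [rng.gauss(0.0, math.sqrt(var_b)) for _ in range(n_samples)]
        # One "stone": log of the mean importance ratio between rungs,
        # computed stably via log-sum-exp
        terms = [(b_next - b) * log_ratio(x) for x in samples]
        m = max(terms)
        log_z += m + math.log(sum(math.exp(t - m) for t in terms) / n_samples)
    return log_z

est = stepping_stone_demo()
print(est, math.log(math.sqrt(2 * math.pi)))  # estimate vs truth ≈ 0.9189
```

The telescoping sum of per-rung log ratios estimates $\log(Z_1/Z_0)$; since the reference is normalized ($Z_0 = 1$), this is the log normalization constant of the target.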

Last round estimate: $-11.771971135020596$

🔍 Full page 🔗 Info

Round trips

Number of tempered restarts as a function of the adaptation round.

A tempered restart happens when a sample from the reference percolates to the target. When the reference supports iid sampling, tempered restarts can enable large jumps in the state space.

🔍 Full page 🔗 Info

Pigeons summary

| round | n_scans | n_tempered_restarts | global_barrier | global_barrier_variational | last_round_max_time | last_round_max_allocation | stepping_stone |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 2 | 0 | 1.03504 | missing | 0.0521581 | 2.18085e6 | -4235.05 |
| 2 | 4 | 0 | 4.05573 | missing | 0.00229714 | 1.72003e6 | -16.2828 |
| 3 | 8 | 0 | 3.4937 | missing | 0.004261 | 3.41355e6 | -12.0918 |
| 4 | 16 | 0 | 2.67657 | missing | 0.00905489 | 7.18346e6 | -10.2398 |
| 5 | 32 | 0 | 4.2856 | missing | 0.0173506 | 1.31234e7 | -11.8208 |
| 6 | 64 | 3 | 3.1656 | missing | 0.0603753 | 2.75211e7 | -11.467 |
| 7 | 128 | 8 | 3.55783 | missing | 0.0740425 | 5.31849e7 | -11.4646 |
| 8 | 256 | 12 | 3.37682 | missing | 0.15472 | 1.06138e8 | -11.5921 |
| 9 | 512 | 37 | 3.47743 | missing | 0.288544 | 2.13762e8 | -12.0006 |
| 10 | 1024 | 77 | 3.55234 | missing | 0.566158 | 4.2902e8 | -11.772 |
💾 CSV🔗 Info

Pigeons inputs

| Keys | Values |
| --- | --- |
| extended_traces | false |
| checked_round | 0 |
| extractor | nothing |
| record | Function[Pigeons.traces, Pigeons.round_trip, Pigeons.log_sum_ratio, Pigeons.timing_extrema, Pigeons.allocation_extrema] |
| multithreaded | false |
| show_report | true |
| n_chains | 10 |
| variational | nothing |
| explorer | nothing |
| n_chains_variational | 0 |
| target | TuringLogPotential{...}(Model{typeof(PigeonsDynamicPPLExt.toy_turing_unid_model), (:n_trials, :n_successes), (), (), Tuple{Int64, Int64}, Tuple{}, DefaultContext}(PigeonsDynamicPPLExt.toy_turing_unid_model, (n_trials = 100000, n_successes = 50000), NamedTuple(), DefaultContext()), false) |
| n_rounds | 10 |
| exec_folder | nothing |
| reference | nothing |
| checkpoint | false |
| seed | 1 |
💾 CSV🔗 Info

Reproducibility

```julia
# Clone the repository and pin the commit used to generate this report
run(`git clone https://github.com/Julia-Tempering/InferenceReport.jl`)
cd("InferenceReport.jl")
run(`git checkout 881ca9b392a4a59a37875104a8a5cf908ce155b3`)

# Activate and instantiate the pinned project environment
using Pkg
Pkg.activate(".")
Pkg.instantiate()

# Re-run the same inference
using Pigeons
inputs = Inputs(;
    target = Pigeons.toy_turing_unid_target(),
    n_rounds = 10,
    record = [traces; round_trip; record_default()])

pt = pigeons(inputs)
```