Description
Consider a Bayesian model where the likelihood is a binomial distribution with probability parameter $p$. We over-parameterize the model by writing $p = p_1 p_2$, and assume that each $p_i$ has a uniform prior on the interval $[0, 1]$.
In summary:
\[\begin{aligned} p_1 &\sim \text{Unif}(0, 1) \\ p_2 &\sim \text{Unif}(0, 1) \\ y &\sim \text{Bin}(n, p_1 p_2) \end{aligned}\]
Here we use the values:
Parameter | Value |
---|---|
Number of trials $n$ | 100000 |
Number of successes $y$ | 50000 |
This is a toy example of an unidentifiable parameterization. In practice, many popular Bayesian models are unidentifiable.
When there are many observations, the posterior of an unidentifiable model concentrates on a sub-manifold, making sampling difficult, as shown in the following pair plots.
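To see the identifiability problem concretely, note that the likelihood depends on $p_1$ and $p_2$ only through the product $p_1 p_2$. The following sketch (plain Python, not part of the report pipeline) checks that distinct pairs lying on the curve $p_1 p_2 = 0.5$ yield the same binomial log-likelihood:

```python
from math import lgamma, log

def binom_loglik(y, n, p):
    """Binomial log-likelihood: log C(n, y) + y log p + (n - y) log(1 - p)."""
    return (lgamma(n + 1) - lgamma(y + 1) - lgamma(n - y + 1)
            + y * log(p) + (n - y) * log(1 - p))

n, y = 100_000, 50_000
# Different (p1, p2) pairs lying on the sub-manifold p1 * p2 = 0.5:
pairs = [(0.5 / p2, p2) for p2 in (0.6, 0.8, 1.0)]
lls = [binom_loglik(y, n, p1 * p2) for p1, p2 in pairs]
# All log-likelihoods coincide (up to floating point), so the data
# cannot distinguish points along this curve.
```

Since the data constrain only the product, the posterior concentrates on this one-dimensional curve as $n$ grows.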
Pair plot
Diagonal entries show estimates of the marginal densities as well as the (0.16, 0.5, 0.84) quantiles (dotted lines). Off-diagonal entries show estimates of the pairwise densities.
An animated version of the pair plot (🍿 movie, linked in the full report) superimposes 100 iterations of MCMC.
Trace plots
Moments
parameters | mean | std | mcse | ess_bulk | ess_tail | rhat | ess_per_sec |
---|---|---|---|---|---|---|---|
p1 | 0.700242 | 0.138025 | 0.0125449 | 122.567 | 159.23 | 1.007 | missing |
p2 | 0.741892 | 0.142979 | 0.0126691 | 122.113 | 170.53 | 1.01075 | missing |
Cumulative traces
For each iteration $i$, the plot shows the running average up to $i$: $\frac{1}{i} \sum_{n = 1}^{i} x_n$.
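As a sketch (plain Python, hypothetical helper name), the running average underlying these traces can be computed in a single pass:

```python
def running_average(xs):
    """Return the sequence of running means (1/i) * sum_{n=1}^{i} x_n."""
    averages, total = [], 0.0
    for i, x in enumerate(xs, start=1):
        total += x
        averages.append(total / i)
    return averages

print(running_average([1.0, 2.0, 3.0, 4.0]))  # [1.0, 1.5, 2.0, 2.5]
```

A flat running average across iterations is a quick visual check that the chain has stabilized around its mean.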
Local communication barrier
When the global communication barrier is large, many chains may be required to obtain tempered restarts.
The local communication barrier can be used to visualize the cause of a high global communication barrier. For example, if there is a sharp peak close to a reference constructed from the prior, it may be useful to switch to a variational approximation.
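As a rough numerical sketch (not Pigeons' implementation): in the framework of Syed et al., 2021, the global barrier can be approximated by the sum of the nearest-neighbour swap rejection rates, while the profile of individual rates across the annealing schedule plays the role of the local barrier. The rates below are hypothetical:

```python
# Hypothetical per-pair swap rejection rates across the annealing schedule.
rejection_rates = [0.30, 0.50, 0.40, 0.20]

# Their sum approximates the global communication barrier Lambda;
# a spike in an individual rate flags a local bottleneck (e.g. a sharp peak).
gcb_estimate = sum(rejection_rates)
worst_pair = max(range(len(rejection_rates)), key=rejection_rates.__getitem__)
```

Here the second pair of chains (index 1) is the bottleneck, suggesting the annealing schedule should place more chains in that region.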
GCB estimation progress
Estimate of the Global Communication Barrier (GCB) as a function of the adaptation round.
The global communication barrier can be used to set the number of chains. The theoretical framework of Syed et al., 2021 shows that, under simplifying assumptions, it is optimal to set the number of chains (the argument `n_chains` in `pigeons()`) to roughly $2\Lambda$.
Last round estimate: $3.5523351664264387$
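For instance, plugging the last-round estimate above into the $2\Lambda$ rule and rounding up (a sketch; Pigeons does not require this manual step):

```python
import math

gcb = 3.5523351664264387   # last-round GCB estimate from this report
n_chains = math.ceil(2 * gcb)  # the 2*Lambda rule, rounded up to an integer
print(n_chains)  # 8
```

This run used `n_chains = 10`, comfortably above the $2\Lambda \approx 7.1$ guideline.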
Evidence estimation progress
Estimate of the log normalization constant (computed using the stepping stone estimator) as a function of the adaptation round.
Last round estimate: $-11.771971135020596$
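To illustrate the idea behind the stepping stone estimator (a toy sketch on a finite state space with exact iid tempered sampling, not Pigeons' implementation): each annealing step estimates a ratio of normalization constants, and the sum of their logs telescopes, $\log Z \approx \sum_k \log \frac{1}{N} \sum_i \exp\big((\beta_{k+1} - \beta_k)\, \ell(x_i^{(k)})\big)$:

```python
import math, random

random.seed(1)
states = list(range(10))
log_tgt = [-(s - 3) ** 2 / 2 for s in states]  # unnormalized target log-density
betas = [k / 10 for k in range(11)]            # annealing schedule 0, 0.1, ..., 1

def sample_tempered(beta, n=20_000):
    # iid sampling from the tempered distribution (exact on a finite space)
    weights = [math.exp(beta * l) for l in log_tgt]
    return random.choices(states, weights=weights, k=n)

# Each term estimates log(Z_{beta_next} / Z_{beta}); the sum telescopes
# to log(Z_target / Z_reference), with a uniform reference here.
log_z = 0.0
for beta, beta_next in zip(betas[:-1], betas[1:]):
    xs = sample_tempered(beta)
    ratios = [math.exp((beta_next - beta) * log_tgt[x]) for x in xs]
    log_z += math.log(sum(ratios) / len(ratios))

# Exact value, available here because the state space is finite:
exact = math.log(sum(math.exp(l) for l in log_tgt) / len(states))
```

On this toy problem the estimate agrees closely with the exact log normalization; in Pigeons the same telescoping runs over the tempered chains already being simulated.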
Round trips
Number of tempered restarts as a function of the adaptation round.
A tempered restart happens when a sample from the reference percolates to the target. When the reference supports iid sampling, tempered restarts can enable large jumps in the state space.
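A minimal sketch of how tempered restarts could be counted from a replica's chain-index path (hypothetical helper, not Pigeons' internal bookkeeping), counting each traversal from the reference chain to the target chain:

```python
def count_tempered_restarts(chain_path, n_chains):
    """Count traversals from the reference chain (index 1) to the target (index n_chains)."""
    restarts, visited_reference = 0, False
    for chain in chain_path:
        if chain == 1:
            visited_reference = True
        elif chain == n_chains and visited_reference:
            restarts += 1
            visited_reference = False
    return restarts

print(count_tempered_restarts([1, 2, 3, 2, 3, 1, 2, 3], n_chains=3))  # 2
```

The second visit to chain 3 without an intervening return to chain 1 is not counted: a restart requires a fresh sample percolating all the way from the reference.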
Swaps plot
Pigeons summary
round | n_scans | n_tempered_restarts | global_barrier | global_barrier_variational | last_round_max_time | last_round_max_allocation | stepping_stone |
---|---|---|---|---|---|---|---|
1 | 2 | 0 | 1.03504 | missing | 0.0218598 | 1.19506e6 | -4235.05 |
2 | 4 | 0 | 4.05573 | missing | 0.00213898 | 1.72131e6 | -16.2828 |
3 | 8 | 0 | 3.4937 | missing | 0.00380427 | 3.41611e6 | -12.0918 |
4 | 16 | 0 | 2.67657 | missing | 0.00790396 | 7.18858e6 | -10.2398 |
5 | 32 | 0 | 4.2856 | missing | 0.0420884 | 1.31336e7 | -11.8208 |
6 | 64 | 3 | 3.1656 | missing | 0.0293804 | 2.75416e7 | -11.467 |
7 | 128 | 8 | 3.55783 | missing | 0.0659645 | 5.32259e7 | -11.4646 |
8 | 256 | 12 | 3.37682 | missing | 0.14212 | 1.0622e8 | -11.5921 |
9 | 512 | 37 | 3.47743 | missing | 0.285916 | 2.13926e8 | -12.0006 |
10 | 1024 | 77 | 3.55234 | missing | 0.543977 | 4.29348e8 | -11.772 |
Pigeons inputs
Keys | Values |
---|---|
extended_traces | false |
checked_round | 0 |
extractor | nothing |
record | Function[Pigeons.traces, Pigeons.round_trip, Pigeons.log_sum_ratio, Pigeons.timing_extrema, Pigeons.allocation_extrema] |
multithreaded | false |
show_report | true |
n_chains | 10 |
variational | nothing |
explorer | nothing |
n_chains_variational | 0 |
target | TuringLogPotential{...}(Model{typeof(PigeonsDynamicPPLExt.toy_turing_unid_model), (:n_trials, :n_successes), (), (), Tuple{Int64, Int64}, Tuple{}, DefaultContext}(PigeonsDynamicPPLExt.toy_turing_unid_model, (n_trials = 100000, n_successes = 50000), NamedTuple(), DefaultContext()), DefaultContext(), 2) |
n_rounds | 10 |
exec_folder | nothing |
reference | nothing |
checkpoint | false |
seed | 1 |
Reproducibility
```julia
# Clone the report repository and check out the exact commit used for this run
run(`git clone https://github.com/Julia-Tempering/InferenceReport.jl`)
cd("InferenceReport.jl")
run(`git checkout f26f12430f67a2c8c64988c8c8c1d80696a7d7fd`)

# Instantiate the pinned project environment
using Pkg
Pkg.activate(".")
Pkg.instantiate()

# Re-run the sampler with the same inputs
using Pigeons
inputs = Inputs(;
    target = Pigeons.toy_turing_unid_target(),
    n_rounds = 10,
    record = [traces; round_trip; record_default()])
pt = pigeons(inputs)
```