Roadmap 0.1.0
- [ ] Documentation
- [ ] Sample (selected) deterministic variables
- [ ] HMC from blackjax-devs/blackjax
- [ ] NUTS from blackjax-devs/blackjax
- [ ] Effective sample size from blackjax-devs/blackjax
- [ ] Models are distributions
- [ ] Fix bug with rng keys in forward sampling #90
- [ ] Refactor the core (#4)
- [ ] #98
- [ ] document the core extensively
- [x] Rhat (online estimation)
- [ ] Average acceptance rate
- [x] Number of divergences
- [ ] Inference summary
What it is not about:
- Warmup using Adam instead of Dual Averaging
- Dynamical HMC
- Discontinuous HMC
- Riemannian manifold Hamiltonian Monte Carlo
- Metropolis-within-Gibbs
- Particle Filtering / Sequential Monte Carlo
- HMC augmented with NN
- HMC with Neural Transport
- More distributions
- Stochastic processes
- Simulation-based calibration
- Population predictive check
- Rao-Blackwellization
- Automatic reparametrization
- Bayesian Neural Networks (by subclassing trax) -> 4d5305a5433211624e1e2feaac541511d4ae59f9
- Measure effective sample size with regenerative methods
- Stochastic gradient HMC which could be the first use of a sequential sampler. Also can be used for stochastic conditioning.
Hi @rlouf , I was searching for the do-operator recently, and it seems it got removed at some point.
Are there any plans to add it back? It is marked as "done" in the roadmap above. I can hack around it, but it would be super useful for some code examples in Rethinking :sweat_smile:
I took it off because the implementation would need to be quite different; the design I had in mind was that when we use `.do(a=10)`, the variable `a` becomes an argument to every target function, and we partially apply these with `a`'s value. Not as simple as before, but I would definitely appreciate a PR adding this to MCX.
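A minimal sketch of that design, assuming plain Python and `functools.partial` — the names `logpdf` and `do` are illustrative only, not MCX's actual API:

```python
from functools import partial

def logpdf(a, b):
    # Toy target: log p(b | a) up to a constant (standard normal kernel).
    return -((b - a) ** 2) / 2.0

def do(target, **interventions):
    """Fix intervened variables by partially applying the target function.

    Under the design above, `a` is an ordinary argument of every target
    function, so an intervention is just a partial application.
    """
    return partial(target, **interventions)

intervened = do(logpdf, a=10)
print(intervened(b=11.0))  # -0.5
```

The appeal of this approach is that the intervened model stays an ordinary callable, so samplers need no special handling for `do`.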