AdvancedVI v0.6.0
New Algorithms
This update adds new variational inference algorithms that build on the flexibility introduced in the v0.5 update.
Specifically, the following measure-space optimization algorithms have been added:
- `KLMinWassFwdBwd`
- `KLMinNaturalGradDescent`
- `KLMinSqrtNaturalGradDescent`
- `FisherMinBatchMatch`
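These algorithms operate directly on the variational measure rather than on an unconstrained parameter vector. As a rough usage sketch (assuming a toy `LogDensityProblems` target, a single-stepsize `KLMinWassFwdBwd` constructor, and the `optimize` entry point; the constructor argument, the `optimize` signature, and the return values shown here are assumptions, so consult the documentation for the actual API):

```julia
using AdvancedVI, Distributions, LinearAlgebra, LogDensityProblems

# Toy target: a standard bivariate Gaussian, exposed through the
# LogDensityProblems interface that AdvancedVI consumes.
struct Target end
LogDensityProblems.logdensity(::Target, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::Target) = 2
LogDensityProblems.capabilities(::Type{Target}) =
    LogDensityProblems.LogDensityOrder{0}()

prob = Target()

# Full-rank Gaussian initialization; the measure-space algorithms
# update the Gaussian directly. (The family type is an assumption.)
q_init = MvNormal(zeros(2), Matrix{Float64}(I, 2, 2))

# Hypothetical stepsize of 0.01; see the docs for tuning guidance.
alg = KLMinWassFwdBwd(0.01)

# Run for a fixed number of steps. (Return values are an assumption.)
q, info, state = AdvancedVI.optimize(alg, 1_000, prob, q_init)
```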
Interface Change (breaking)
The objective value returned by `estimate_objective` is now the value minimized by the algorithm.
For instance, for ELBO-maximization algorithms, `estimate_objective` will return the negative ELBO.
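Concretely, code that previously read the ELBO directly off `estimate_objective` must now negate the returned value. A minimal sketch (the exact signature and keyword names here are assumptions):

```julia
# `estimate_objective` returns the quantity the algorithm minimizes,
# so for a KL-minimization (ELBO-maximization) algorithm this is the
# negative ELBO; negate it to recover the ELBO itself.
obj  = AdvancedVI.estimate_objective(alg, q, prob; n_samples = 256)
elbo = -obj
```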
Behavior Change
In addition, `KLMinRepGradDescent`, `KLMinRepGradProxDescent`, and `KLMinScoreGradDescent` will now throw an exception if the objective value estimated at a step is degenerate (`Inf` or `NaN`). Previously, the algorithms ran until `max_iter` even if the optimization run had already failed.
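To handle such failures gracefully rather than crash, an ordinary `try`/`catch` around the run suffices. A sketch (the concrete exception type and the `optimize` signature are assumptions):

```julia
# The run now fails fast on a degenerate (Inf/NaN) objective estimate
# instead of silently iterating to max_iter.
try
    q, info, state = AdvancedVI.optimize(alg, max_iter, prob, q_init)
catch e
    @error "VI run diverged; consider reducing the stepsize" exception = e
    rethrow()
end
```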
Merged pull requests:
- Move general reshuffling stuff into a separate file (#208) (@Red-Portal)
- Move `SubsampledNormals` test target to its own file under `test/models/` (#209) (@Red-Portal)
- Add the forward-backward Wasserstein Gaussian variational inference algorithm (#210) (@Red-Portal)
- Add natural gradient variational inference algorithms (#211) (@Red-Portal)
- Check that the objective value is finite in the shared `step` (#212) (@Red-Portal)
- Refactor algorithm unit tests (#213) (@Red-Portal)
- Add documentation for the natural gradient algorithms (#214) (@Red-Portal)
- Specify convention for `estimate_objective` (#215) (@Red-Portal)
- CompatHelper: bump compat for AdvancedVI to 0.5 for package bench, (keep existing compat) (#216) (@github-actions[bot])
- CompatHelper: bump compat for AdvancedVI to 0.5 for package docs, (keep existing compat) (#217) (@github-actions[bot])
- Batch-and-Match algorithm for minimizing the covariance-weighted Fisher divergence (#218) (@Red-Portal)
Closed issues:
- Natural Gradients + Monte Carlo VI (#1)