Releases: danielward27/flowjax
v7.0.0
What's Changed
- Improve documentation by @danielward27 in #57
- Improve documentation structure by @danielward27 in #58
- Use math instead of jnp in TanhLinearTails init by @danielward27 in #59
- Multiple ndim by @danielward27 in #61
#61 introduces important (and breaking) changes:
- Distributions and bijections now have shapes, rather than assuming everything is a vector with shape `(dim,)`. Distributions and bijections must define the attributes `shape` and `cond_shape` (see `Distribution` and `Bijection`). `distribution.sample()` now has a `sample_shape: Tuple[int]` argument, replacing `n: int`; providing `sample_shape=(n,)` is equivalent to the previous `n=n` (see the sketch after this list).
- Transformers have been removed; instead, `Bijection` types can be used directly as transformers, i.e. `Affine()` or `RationalQuadraticSpline(**args)` should be used instead of `AffineTransformer()` or `RationalQuadraticSplineTransformer(**args)`.
- `RationalQuadraticSpline` arguments renamed to `knots` and `interval`, rather than `k` and `B`.
- Renamed `ScannableChain` to `Scan`.
- Better documentation.
- Bumps version to 7.0.0
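A minimal migration sketch for #61. The key-based `sample` call and the `Normal` constructor argument are assumptions based on the surrounding notes; exact signatures may differ from the released API:

```python
import jax.numpy as jnp
import jax.random as jr
from flowjax.bijections import RationalQuadraticSpline
from flowjax.distributions import Normal

key = jr.PRNGKey(0)
dist = Normal(jnp.zeros(3))  # a distribution with shape (3,) (illustrative)

# Before v7: samples = dist.sample(key, n=100)
# From v7, sample_shape replaces n:
samples = dist.sample(key, sample_shape=(100,))  # leading sample axes (100,)

# Transformers are gone: pass a Bijection directly where a
# *Transformer object was previously required.
transformer = RationalQuadraticSpline(knots=8, interval=4)  # was k=8, B=4
```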
Full Changelog: v6.1.0...v7.0.0
v6.1.0
What's Changed
Added Sphinx documentation
- Docs by @danielward27 in #54
- Update documentation.yml to use main by @danielward27 in #55
- Docs by @danielward27 in #56
Full Changelog: v6.0.0...v6.1.0
v6.0.0
Lots of new features, bug fixes, and breaking changes.
Main new features:
- New bijections and distributions.
- Added `ScannableChain`, which facilitates scanning (`jax.lax.scan`) over flow layers, massively decreasing compile times (see the sketch below).
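A minimal sketch of the idea (not flowjax's actual implementation): stack the parameters of structurally identical layers along a leading axis and apply them with `jax.lax.scan`, so XLA traces and compiles one layer rather than unrolling all of them:

```python
import jax
import jax.numpy as jnp

def layer(x, params):
    # A toy affine "flow layer"; real flow layers are more complex.
    scale, shift = params
    return x * scale + shift

def chain(x, stacked_params):
    # scan applies `layer` once per leading-axis slice of the stacked
    # parameters, so only a single layer is traced and compiled.
    def body(carry, params):
        return layer(carry, params), None
    y, _ = jax.lax.scan(body, x, stacked_params)
    return y

n_layers, dim = 10, 3
stacked = (jnp.ones((n_layers, dim)), jnp.zeros((n_layers, dim)))
y = chain(jnp.arange(3.0), stacked)  # equivalent to 10 sequential layers
```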
Some key breaking changes from `v5`:
- Changes in distribution parameterisation (e.g. `Normal`).
- Flows changed to classes rather than functions (e.g. `coupling_flow` is now `CouplingFlow`, etc.).
- Removal of `intertwine_flip` and `intertwine_permute` (tended to be less readable without saving many lines).
- Changing of module names, and separating some functionality into new modules:
  - `flowjax.bijections.bnaf` -> `flowjax.bijections.block_autoregressive_network`
  - `flowjax.nn.bnaf` -> `flowjax.nn.block_autoregressive`
- Neural networks now in `flowjax.nn`
- Masks now in `flowjax.masks`
What's Changed
- Affine by @danielward27 in #31
- Filter spec by @danielward27 in #32
- 29 add more distributions by @Tennessee-Wallaceh in #30
- Distributions update by @danielward27 in #33
- train_flow update by @danielward27 in #34
- Jit val loss by @danielward27 in #36
- Distribution broadcasting by @danielward27 in #37
- Add tanh bijector by @danielward27 in #38
- Export bijections by @danielward27 in #39
- Bijection tests by @danielward27 in #40
- Add basic bijections by @danielward27 in #41
- update docs by @danielward27 in #42
- Add preprocess bijection by @danielward27 in #43
- Triangular weight norm by @danielward27 in #44
- Remove preprocess and add scaling to FAQ by @danielward27 in #45
- Partial by @danielward27 in #46
- Support bools in partial by @danielward27 in #47
- Remove intertwine flip/permute by @danielward27 in #48
- Tidy by @danielward27 in #49
- Actions by @danielward27 in #50
- Recompilation by @danielward27 in #51
- Formatting by @danielward27 in #52
- Refactor by @danielward27 in #53
New Contributors
- @Tennessee-Wallaceh made their first contribution in #30
Full Changelog: v5.0.0...v6.0.0
v5.0.0
v4.0.2
New version (the version in setup.py for the last release did not align with the tag, so it failed to publish).
v4.0.1
What's Changed
- Fix error for random permutations within 1 layer flows by @danielward27 in #27
Full Changelog: v4.0.0...v4.0.1
v4.0.0
What's Changed
- `BlockAutoregressiveNetwork` and `BlockNeuralAutoregressiveFlow` arguments changed, using depth to refer to the number of hidden layers.
- Re-implemented `CouplingFlow` to take an arbitrary `Transformer` in #19
- Add in `MaskedAutoregressiveFlow` (to be used with an arbitrary `Transformer`) in #18 and #20
- Updated documentation in #21
- Renamed `ParameterisedBijection` to `Transformer` in #22
- Add `Transformed` and `Invert` in #23. The `Flow` class is now removed in favour of just using `Transformed`. In general, `Transformed(base_dist, Invert(bijection))` will be equivalent to the previous `Flow(base_dist, bijection)`. The `Invert` is required as in `Transformed` we take the "transform" direction of the bijection to implement sampling, and the "inverse" direction for use in density evaluations (the inverse of the `Flow` implementation). See the sketch after this list.
- Transformers have been renamed with a `Transformer` suffix in #24
- Update permute in #26
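A hedged sketch of the `Transformed`/`Invert` pattern described above. The import paths and constructor arguments follow the later flowjax API and are assumptions for this version; only the `Transformed(base_dist, Invert(bijection))` form comes from the note itself:

```python
import jax.numpy as jnp
from flowjax.bijections import Affine, Invert  # assumed import paths
from flowjax.distributions import Normal, Transformed

base_dist = Normal(jnp.zeros(2))                         # illustrative
bijection = Affine(loc=jnp.zeros(2), scale=jnp.ones(2))  # illustrative

# Previously: flow = Flow(base_dist, bijection)
flow = Transformed(base_dist, Invert(bijection))
# Transformed uses the bijection's "transform" direction for sampling
# and its "inverse" direction for density evaluation, so wrapping the
# bijection in Invert recovers the old Flow behaviour.
```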
Full Changelog: v3.0.2...v4.0.0
v3.0.2
Update version
v3.0.1
Remove unneeded import
v3.0.0
What's Changed
- Clip grad by @danielward27 in #15
- Remove buffer for int arrays by @danielward27 in #16
- Improve bnaf parameterisation by @danielward27 in #17
Full Changelog: v2.0.0...v3.0.0