NuX
Normalizing Flows using JAX
view repo
Normalizing flows provide a general mechanism for defining expressive probability distributions, only requiring the specification of a (usually simple) base distribution and a series of bijective transformations. There has been much recent work on normalizing flows, ranging from improving their expressive power to expanding their application. We believe the field has now matured and is in need of a unified perspective. In this review, we attempt to provide such a perspective by describing flows through the lens of probabilistic modeling and inference. We place special emphasis on the fundamental principles of flow design, and discuss foundational topics such as expressive power and computational trade-offs. We also broaden the conceptual framing of flows by relating them to more general probability transformations. Lastly, we summarize the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
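The core mechanism the abstract describes, a base distribution pushed through a bijection, with densities related by the change-of-variables formula, can be sketched in a few lines of JAX. The affine transform and parameter values below are illustrative assumptions, not NuX's actual API:

```python
# Minimal sketch of a one-layer normalizing flow in JAX.
# The affine bijection f(z) = scale * z + shift is a toy choice,
# not NuX's API; real flows compose many learned bijections.
import jax.numpy as jnp
from jax.scipy.stats import norm

scale, shift = 2.0, 1.0  # assumed fixed parameters for illustration

def forward(z):
    # Bijection mapping base samples z to data space x
    return scale * z + shift

def inverse(x):
    # Exact inverse of the bijection
    return (x - shift) / scale

def log_prob(x):
    # Change of variables:
    # log p(x) = log p_base(f^{-1}(x)) + log |d f^{-1}(x) / dx|
    z = inverse(x)
    log_det = -jnp.log(jnp.abs(scale))  # derivative of the inverse is 1/scale
    return norm.logpdf(z) + log_det     # standard-normal base distribution

x = jnp.array([1.0, 3.0])
print(log_prob(x))
```

Because the transform here is affine, the resulting density is just a Gaussian with mean `shift` and standard deviation `scale`; expressive flows obtain complex densities by stacking many such invertible layers with learned, input-dependent parameters.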