Discussion of Normalizing Flows

Hi,
I have applied the BayesFlow framework in my research area, and the performance has been fairly good so far. BayesFlow relies on normalizing flows (NFs) as its core mechanism. Here, I would like to discuss some aspects of NFs, and I welcome any comments, suggestions, and paper recommendations.

A crucial limitation of NFs is their substantial memory requirement, which arises from keeping the dimension of the latent space equal to that of the input space. I am seeking advice on addressing this curse of dimensionality in NFs. One possible solution is to apply dimensionality reduction before the NF, for example with an encoder network.

Thanks!

Hi Jice, you can try flow matching ([2305.17161] Flow Matching for Scalable Simulation-Based Inference) or consistency models ([2312.05440] Consistency Models for Scalable and Fast Simulation-Based Inference), both of which are examples of free-form architectures.

I don’t have experience with applications that perform dimensionality reduction in parameter space, as the typical mode of operation is one where all model parameters are of scientific interest and cannot / should not be reduced. However, you can always pre-train an autoencoder that further (conditionally) compresses the parameters and then train the NF on the resulting latent codes. Is that what you have in mind?
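For concreteness, here is a minimal sketch of that two-stage recipe in PyTorch. All names, dimensions, and the single coupling block are illustrative assumptions, not BayesFlow code; in practice you would stack several coupling blocks with permutations in between:

```python
import math
import torch
import torch.nn as nn

D, d = 1024, 16  # assumed: 1024-dim parameters compressed to a 16-dim latent

# Stage 1: deterministic autoencoder, fit with a reconstruction loss on theta
encoder = nn.Sequential(nn.Linear(D, 256), nn.ReLU(), nn.Linear(256, d))
decoder = nn.Sequential(nn.Linear(d, 256), nn.ReLU(), nn.Linear(256, D))

class AffineCoupling(nn.Module):
    """One RealNVP-style coupling block operating on the d-dim latent."""
    def __init__(self, dim):
        super().__init__()
        self.k = dim // 2
        self.net = nn.Sequential(nn.Linear(self.k, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * (dim - self.k)))

    def forward(self, z):
        z1, z2 = z[:, :self.k], z[:, self.k:]
        s, t = self.net(z1).chunk(2, dim=-1)
        s = torch.tanh(s)  # keep the scales well-behaved
        return torch.cat([z1, z2 * s.exp() + t], dim=-1), s.sum(-1)

flow = AffineCoupling(d)

def flow_nll(theta):
    """Stage 2: negative log-likelihood of encoded parameters under the flow."""
    z = encoder(theta).detach()  # freeze the pre-trained encoder
    u, log_det = flow(z)
    log_base = -0.5 * (u.pow(2).sum(-1) + d * math.log(2 * math.pi))
    return -(log_base + log_det).mean()
```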

1 Like

Hi Stefan,
Thank you for suggesting the papers. The methods presented are advanced and effectively remove architectural constraints of traditional normalizing flows (NFs), enhancing their expressivity while still keeping the input and latent spaces at the same dimension.

I have experimented with applying dimensionality reduction techniques as a preprocessing step before training the NF model, and it works well. Additionally, combining variational autoencoders (VAEs) with NFs has shown promising results. However, I am still curious whether there is a solution within the NF framework itself that addresses this issue without relying on external dimensionality reduction techniques.

From my literature review, I have found that some researchers have proposed injective flows ([2102.10461] Trumpets: Injective Flows for Inference and Inverse Problems), surjective flows (Journal of Open Source Software: Surjectors: surjection layers for density estimation with normalizing flows), and GAN-Flow ([2310.04690] A dimension-reduced variational approach for solving physics-based inverse problems using generative adversarial network priors and normalizing flows). It seems these methods are not widely adopted yet. I am particularly unsure about the performance differences between injective and surjective flows and which might be better in practice. Moreover, I wonder about their potential side effects, as I haven’t found any work directly comparing the two.
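For reference, the picture I have of the injective case: the generator g: R^d → R^D has a rectangular Jacobian, so the usual log|det J| term in the change of variables is replaced by ½ log det(JᵀJ). A toy numerical illustration of that correction (PyTorch; the map below is made up purely for this sanity check):

```python
import torch
from torch.autograd.functional import jacobian

d, D = 2, 5
W = torch.randn(D, d)  # full column rank almost surely

def g(z):
    """A toy injective map R^d -> R^D: elementwise tanh of a linear embedding."""
    return torch.tanh(W @ z)

z = torch.randn(d)
J = jacobian(g, z)                     # rectangular D x d Jacobian
log_vol = 0.5 * torch.logdet(J.T @ J)  # generalized volume-change term
# On the manifold g(R^d):  log p_X(g(z)) = log p_Z(z) - log_vol
```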

Does anyone have better ideas or insights on this matter?

Thanks!

Jice

Normalizing Flows are inherently not well suited to problems where the data lie on a low-dimensional manifold [1]. External dimensionality reduction currently seems to be your best bet. I would recommend following the general approach of [2], which is to use a deterministic encoder-decoder architecture and apply your generative model in the latent space of the encoder. Diffusion and Flow Matching are particularly well suited due to their free-form Jacobians, but you can also try [3] if you want to stick with NFs.
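To illustrate why the free-form property sidesteps the dimensionality constraint, here is a minimal conditional flow matching objective with straight-line paths (PyTorch; all names and sizes are my own illustrative assumptions, not an existing interface). The velocity network is an unconstrained regressor, so no invertibility or dimension-matching is required:

```python
import torch
import torch.nn as nn

dim_theta, dim_x = 16, 64  # assumed parameter and observation dimensions
v_net = nn.Sequential(     # free-form: no invertibility constraint
    nn.Linear(dim_theta + dim_x + 1, 128), nn.ReLU(),
    nn.Linear(128, dim_theta))

def cfm_loss(theta, x):
    """Regress the constant velocity of straight noise-to-parameter paths."""
    t = torch.rand(theta.size(0), 1)
    eps = torch.randn_like(theta)
    z_t = (1 - t) * eps + t * theta  # point on the linear path at time t
    target = theta - eps             # that path's (constant) velocity
    pred = v_net(torch.cat([z_t, x, t], dim=-1))
    return ((pred - target) ** 2).mean()
```

Sampling then amounts to integrating dz/dt = v(z, x, t) from t = 0 to t = 1 with any ODE solver.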

2 Likes

Thanks for the comments. You are right that NFs are not a good option for high-dimensional problems; the straightforward way is to apply the dimensionality reduction externally.
Do you have any other suggestions for high-dimensional inverse problems besides NFs?
Best,

Jice

Hi Jice,

You could try one of the following options:

Conditional Flow Matching

  • Paper: [2305.17161] Flow Matching for Scalable Simulation-Based Inference
  • BayesFlow implementation will come in BayesFlow 2.0, with an interface like bf.networks.FlowMatching()

Conditional Consistency Models

  • Paper: [2312.05440] Consistency Models for Scalable and Fast Simulation-Based Inference
  • BayesFlow implementation will come in BayesFlow 2.0 as well (not yet implemented in pre-alpha), with an interface like bf.networks.ConsistencyModel()
  • If you’re interested in preliminary code for consistency models, let me know and I’ll share the development repo in the “old” BayesFlow API with you.

Cheers!

1 Like

Hi Marvin,
Thanks for the suggestion! I’m excited to hear that BayesFlow is under continuous development. Is the linked repository (GitHub - stefanradev93/BayesFlow: A Python library for amortized Bayesian workflows using generative neural networks) the pre-alpha version that includes bayesflow.experimental.rectifiers and bf.networks.FlowMatching()?
Thanks!

Jice