Confused about training `AmortizedPosteriorEstimator`

I am fairly new to BayesFlow and Bayesian Flow Networks in general. I was previously developing a Bayesian neural network using tensorflow_probability, but moved to BFN once I saw BayesFlow.

I have simulated astrophysics data stored on disk: a dataset with features and corresponding labels, as in usual deep learning.

I have a few questions. For each of them I have gone through the docs, but wasn't able to clear things up:

  1. While training with trainer.train_offline(), what is the simulation_dict supposed to contain? Only my X_train, both X_train and y_train, or y_train only? Consequently, what is prior_draws in this context: is it related to y_train, or is it only the draws from the prior object I defined? (A sketch of what I am currently passing is below this list.)

  2. I have a 3D object as input for each example and a 1D vector (len=3) as output labels, so with the batch dimension the exact shape is (batch_size, 100, 1000, 2) for each input batch. How do I incorporate this in a DeepSet model? It seems to only accept 2D inputs, not 3D.
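For concreteness on the first question, this is roughly what I am currently passing; the key names are my guess from the docs, and X_train/y_train are my own arrays:

# My guess: the parameters (labels) go under "prior_draws",
# the simulated data under "sim_data"
simulations_dict = {
    "prior_draws": y_train,  # shape (n_sim, 3)
    "sim_data": X_train,     # shape (n_sim, 100, 1000, 2)
}
history = trainer.train_offline(simulations_dict, epochs=10, batch_size=32)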

Hi, welcome to the BayesFlow Forums!

Just to clear up any misunderstandings: BayesFlow does not (currently) implement Bayesian Flow Networks (BFN, arXiv:2308.07037). BFNs are quite new and the name collision is an unfortunate coincidence.

BayesFlow mainly implements normalizing flows (hence the name) and flow matching for conditional density estimation. You can use BayesFlow for (Bayesian) parameter estimation or model comparison.

You can find more information here:

Hope that helps. Cheers,
Marvin


Thank you for taking the time to reply, I really appreciate it. Could you please respond to my second question too? I have figured out the first one, but would appreciate a response on it as well. Thanks again!

Since your summary network needs to reduce a 3D tensor to a 1D vector for conditioning, you can use the HierarchicalNetwork interface, e.g.:

summary_net = bf.networks.HierarchicalNetwork([
    bf.networks.DeepSet(summary_dim=16), 
    bf.networks.DeepSet(summary_dim=64)
])

This network will successively reduce a 4D batch to a 2D batch. Note, however, that it assumes exchangeability (i.e., IID data points) along each of the reduced axes. If that is not desirable, you can use a backbone network different from a DeepSet.
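For a quick sanity check of the shapes, something like this should do (the batch of dummy data is made up, and I am assuming the networks are applied from the innermost axis outwards):

import numpy as np

# Dummy batch with the shapes from your question: (batch_size, 100, 1000, 2)
x = np.random.normal(size=(8, 100, 1000, 2)).astype(np.float32)

# The first DeepSet pools over the 1000-axis, the second over the 100-axis,
# leaving one summary vector per data set
print(summary_net(x).shape)  # expected: (8, 64)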

Let us know if that helps!


Thanks for your reply. I have found my way through the problem using a mix of Conv2D layers and a SequenceNetwork. It is working pretty well, and I have also saved the trained checkpoints for the summary network.
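In rough strokes, the summary network looks something like the sketch below; the layer sizes are placeholders rather than my actual settings, and ConvSequenceSummary is just a name I made up for the sketch:

import tensorflow as tf
import bayesflow as bf

class ConvSequenceSummary(tf.keras.Model):
    """Conv2D layers compress each (100, 1000, 2) example as a 2-channel
    image; a SequenceNetwork then reduces the result to a fixed summary."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.conv = tf.keras.Sequential([
            tf.keras.layers.Conv2D(8, 3, strides=2, padding="same", activation="relu"),
            tf.keras.layers.Conv2D(16, 3, strides=2, padding="same", activation="relu"),
        ])
        # Assuming SequenceNetwork takes summary_dim like the other summary nets
        self.sequence_net = bf.networks.SequenceNetwork(summary_dim=32)

    def call(self, x, **kwargs):
        h = self.conv(x)  # -> (batch, 25, 250, 16)
        # Flatten the spatial grid into a sequence of feature vectors
        h = tf.reshape(h, (tf.shape(h)[0], -1, h.shape[-1]))
        return self.sequence_net(h, **kwargs)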

However, for the inference network, I am not sure whether it can be saved or not. I can't find anything in the BayesFlow docs or on the forum about that. Does it not need to be saved? Precisely, my inference network is as follows:

self.inference_network = InvertibleNetwork(
    num_params=8,  # parameters to estimate
    num_coupling_layers=8,
    use_act_norm=True,
    coupling_design="affine",
    permutation="learnable",
    coupling_settings={"mc_dropout": True, "dense_args": dict(units=128, activation="elu")},
)

Other than this, is there any way to load the saved models without starting the training phase?

(Just btw, I am following the quick amortized posterior estimation guide as a baseline.)

Thanks in advance

Hi,

BayesFlow currently takes care of saving/loading all networks contained in your Amortizer instance via the Trainer instance. You can find an example in this tutorial notebook in the “Defining the Trainer” section. If checkpoints are already present at the checkpoint path, initializing the trainer loads them and you can proceed with your pre-trained networks without any further training.
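To make that concrete, here is a minimal sketch; the checkpoint path is made up, and amortizer stands for your AmortizedPosterior wrapping the summary and inference networks:

import bayesflow as bf

# If checkpoints already exist under checkpoint_path, initializing the
# trainer loads them; otherwise they are written there during training
trainer = bf.trainers.Trainer(
    amortizer=amortizer,
    checkpoint_path="checkpoints/astro_model",  # hypothetical path
)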

Best,
Lasse


Hey, thank you so much for the reply. I was missing the checkpoint_path parameter when defining the trainer (I was passing it to trainer.train_offline instead) and was wondering why it wasn't working. It works now.

I wholeheartedly thank you all for the framework and for the amazing team you have.
