Hello!
I am using BayesFlow for the first time and am having some issues doing offline training with a summary network (everything works fine if I exclude it and supply the data directly to the inference network). Basically, I am simulating from an agent-based model in R and then bringing the data (13 parameters and 1921 output features) into Python so that it has the format required by BayesFlow. Here is what the resulting data dictionary looks like:
{'prior_non_batchable_context': None, 'prior_batchable_context': None, 'prior_draws': array([[ 0.76718193, -0.87377334, 0.22366658, ..., -0.39844744,
-0.35801412, 0.04897532],
[ 1.30669447, 0.66729612, 1.7323655 , ..., -1.21459231,
0.74958571, -0.25844436],
[ 0.90680171, 1.29136708, 0.70601619, ..., -1.28499242,
0.98797805, 0.6543645 ],
...,
[-1.57379467, -0.29633931, -0.97821418, ..., -2.09261737,
-0.91399776, 1.30333936],
[ 0.45297908, -0.91439699, 0.79108664, ..., -0.16306107,
-0.50990459, -0.60457136],
[-0.80346153, 0.05948135, -1.28584127, ..., 0.532937 ,
0.54166668, -0.12752289]]), 'sim_non_batchable_context': None, 'sim_batchable_context': None, 'sim_data': array([[ 0.44540442, 0.40116571, -0.54365648, ..., -0.01023475,
-0.01002833, -0.01001542],
[-0.84753774, -1.34450615, -1.15397759, ..., -0.01023475,
-0.01002833, -0.01001542],
[ 0.44540442, 0.40116571, 1.01192892, ..., -0.01023475,
-0.01002833, -0.01001542],
...,
[-0.19984821, -0.77635165, -1.15397759, ..., -0.01023475,
-0.01002833, -0.01001542],
[-0.60475894, -0.00569413, -1.00236414, ..., -0.01023475,
-0.01002833, -0.01001542],
[ 0.44540442, 0.40116571, 1.01192892, ..., -0.01023475,
-0.01002833, -0.01001542]])}
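(For reference, this is roughly how I assemble that dictionary after exporting the simulations from R; the CSV file names below are just placeholders for my actual export step.)
import numpy as np
import pandas as pd

# load the parameter draws and simulator outputs exported from R
# (hypothetical file names; in practice these come from my ABM runs)
prior_draws = pd.read_csv("prior_draws.csv").to_numpy()  # shape (n_sims, 13)
sim_data = pd.read_csv("sim_data.csv").to_numpy()        # shape (n_sims, 1921)

# assemble the dictionary with the keys shown above
data = {
    "prior_non_batchable_context": None,
    "prior_batchable_context": None,
    "prior_draws": prior_draws,
    "sim_non_batchable_context": None,
    "sim_batchable_context": None,
    "sim_data": sim_data,
}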
Here is my configure_input function:
import numpy as np

# define the function that configures the input for the amortizer
def configure_input(forward_dict):
    # prepare placeholder dict
    out_dict = {}
    # add the keys expected by the amortizer
    out_dict["parameters"] = forward_dict["prior_draws"].astype(np.float32)
    out_dict["summary_conditions"] = forward_dict["sim_data"].astype(np.float32)
    out_dict["direct_conditions"] = None
    # return the output dictionary
    return out_dict
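(As a sanity check, this is roughly how I inspect the shapes the configurator produces; the expected shapes just reflect the 13 parameters and 1921 features mentioned above.)
# quick sanity check (sketch): inspect the configured shapes
configured = configure_input(data)
print(configured["parameters"].shape)          # expected: (n_sims, 13)
print(configured["summary_conditions"].shape)  # expected: (n_sims, 1921)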
Then, if I run the following code:
import bayesflow as bf

summary_net = bf.networks.DeepSet()
inference_net = bf.networks.InvertibleNetwork(num_params=13)
amortized_posterior = bf.amortizers.AmortizedPosterior(inference_net, summary_net)
trainer = bf.trainers.Trainer(amortizer=amortized_posterior, configurator=configure_input, memory=True)
offline_training = trainer.train_offline(simulations_dict=data, epochs=2, batch_size=32)
I get the following error:
ValueError: Exception encountered when calling layer 'sequential_549' (type Sequential).
Input 0 of layer "dense_1530" is incompatible with the layer: expected min_ndim=2, found ndim=1. Full shape received: (64,)
Call arguments received by layer 'sequential_549' (type Sequential):
• inputs=tf.Tensor(shape=(64,), dtype=float32)
• training=True
• mask=None
Do you know what the issue might be?