Hi,
I have recently been trying out BayesFlow v1 for posterior estimation in a hierarchical model. The parameters are:
8 hyperparameters: \mu_1, \mu_2, \mu_3, \mu_4, \sigma_1, \sigma_2, \sigma_3, \sigma_4
4 local parameters: \theta_1, \theta_2, \theta_3, \theta_4
related by \theta_k ~ N(\mu_k, \sigma_k) for k = 1, ..., 4.
The observations are obtained as y = f(\theta_1, \theta_2, \theta_3, \theta_4).
My training data have the following shapes (a minimal stand-in simulator that reproduces them is sketched below):
hyperparameters: 2000 x 8
local parameters: 2000 x 20 x 4, i.e., 20 groups, each with 4 local parameters
observations: 2000 x 20 x 12
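For context, here is a minimal stand-in for my simulator that reproduces this hierarchical structure and these shapes. The real forward model behind `shearbuilding` is domain-specific, so the linear map below is only a placeholder:

```python
import numpy as np

RNG = np.random.default_rng(2023)
A_PLACEHOLDER = RNG.normal(size=(4, 12))  # stands in for the real observation function f

def shearbuilding(batch_size, n_groups=20, n_obs=12):
    """Stand-in simulator: same output shapes as my real model, placeholder physics."""
    # hyperparameters: 4 means and 4 (positive) standard deviations
    mu = RNG.normal(size=(batch_size, 4))
    sigma = RNG.uniform(0.1, 1.0, size=(batch_size, 4))
    hyper_draws = np.concatenate([mu, sigma], axis=-1)               # (batch, 8)

    # local parameters: theta_k ~ N(mu_k, sigma_k), drawn independently per group
    prior_draws = RNG.normal(
        loc=mu[:, None, :], scale=sigma[:, None, :], size=(batch_size, n_groups, 4)
    )                                                                # (batch, 20, 4)

    # observations: y = f(theta) + noise, with a linear map as placeholder for f
    sim_data = prior_draws @ A_PLACEHOLDER + 0.05 * RNG.normal(
        size=(batch_size, n_groups, n_obs)
    )                                                                # (batch, 20, 12)

    return (sim_data.astype("float32"),
            hyper_draws.astype("float32"),
            prior_draws.astype("float32"))
```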
Since the data are 3-D, I do not need a summary network at the first (within-group) level, but I still need one at the second level to aggregate information across the 20 groups. Below is my implementation:
```python
summary_net = HierarchicalNetwork([
    DeepSet(summary_dim=128)
])

local_inference_net = InvertibleNetwork(
    num_params=4, num_coupling_layers=8, coupling_design='affine',
    coupling_settings=SETTINGS_POS, permutation="learnable", name="local_inference"
)

hyper_inference_net = InvertibleNetwork(
    num_params=8, num_coupling_layers=8, coupling_design='affine',
    coupling_settings=SETTINGS_POS, permutation="learnable", name="hyper_inference"
)

local_amortizer = AmortizedPosterior(local_inference_net, name="local_amortizer")
hyper_amortizer = AmortizedPosterior(hyper_inference_net, name="hyper_amortizer")

twolevel_amortizer = TwoLevelAmortizedPosterior(
    summary_net=summary_net,
    local_amortizer=local_amortizer,
    global_amortizer=hyper_amortizer
)
```
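As a sanity check on this single-level setup, this is how I would verify the summary network's output shape on a dummy batch (my own check, not taken from the BayesFlow examples; the expected shape is my assumption):

```python
import numpy as np

# dummy batch with the same layout as my observations: (batch, groups, obs_dim)
dummy_obs = np.random.default_rng(1).normal(size=(2, 20, 12)).astype("float32")
print(summary_net(dummy_obs).shape)  # I expect (2, 128) here
```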
The DeepSet thus compresses the data from (batch_size, 20, 12) into (batch_size, 128). I then set up the configurator and the Trainer as follows:
```python
def configure_input_train(batch_size):
    out_dict = {}
    # note: the argument is ignored; a fresh batch of 128 is simulated here
    sim_data, hyper_draws, prior_draws = shearbuilding(128)
    # add the simulated quantities under the corresponding keys
    out_dict["summary_conditions"] = sim_data
    out_dict["hyper_parameters"] = hyper_draws
    out_dict["local_parameters"] = prior_draws
    return out_dict

# the configurator connects the simulated data with the amortizer
checkpoint_path = "model_checkpoints/Four_modal_data20group"
trainer = Trainer(
    amortizer=twolevel_amortizer,
    generative_model=shearbuilding,
    configurator=configure_input_train,
    checkpoint_path=checkpoint_path,
    max_to_keep=1
)
```
Here, sim_data, hyper_draws, and prior_draws have shapes (128, 20, 12), (128, 8), and (128, 20, 4), respectively.
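A quick way to confirm these shapes is to inspect the configurator output directly (the argument is ignored inside the function, so any value works here):

```python
out = configure_input_train(None)
for key, value in out.items():
    print(key, value.shape)
# summary_conditions (128, 20, 12)
# hyper_parameters   (128, 8)
# local_parameters   (128, 20, 4)
```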
However, this raises the following error:
ConfigurationError: Could not carry out computations of generative_model ->configurator -> amortizer -> loss! Error trace:
not enough values to unpack (expected 2, got 1)
I suspect that something in my configurator is wrong. Any comments or pointers would be much appreciated.
Thanks!