ConfigurationError in hierarchical model

Hi,
I’ve recently been trying BayesFlow v1 for posterior estimation in a hierarchical model. My parameters are:
8 hyperparameters: \mu_1, \mu_2, \mu_3, \mu_4, \sigma_1, \sigma_2, \sigma_3, \sigma_4
and 4 local parameters: \theta_1, \theta_2, \theta_3, \theta_4
with the relation \theta_i ~ N(\mu_i, \sigma_i) for i = 1, ..., 4.
The observations are obtained as y = function(\theta_1, \theta_2, \theta_3, \theta_4).
Here is my training data configuration:
- hyperparameters: 2000 × 8
- local parameters: 2000 × 20 × 4, i.e., 20 groups with 4 local parameters each
- observations: 2000 × 20 × 12
Since the data is 3D, I assumed I do not need a summary network for the first level, but I still need one for the second level to aggregate information across the 20 groups. Below is my implementation:
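For context, here is a minimal stand-in simulator matching these shapes (the hyperpriors and the observation function below are placeholders, not my actual shearbuilding model):

```python
import numpy as np

def shearbuilding_sketch(batch_size, n_groups=20, obs_dim=12):
    """Hypothetical stand-in matching the shapes above; the hyperpriors
    and the observation function are placeholders only."""
    mu = np.random.normal(0.0, 1.0, size=(batch_size, 4))
    sigma = np.abs(np.random.normal(0.5, 0.1, size=(batch_size, 4)))
    hyper_draws = np.concatenate([mu, sigma], axis=-1)            # (batch, 8)
    # theta_i ~ N(mu_i, sigma_i), drawn independently for each group
    theta = np.random.normal(
        mu[:, None, :], sigma[:, None, :], size=(batch_size, n_groups, 4)
    )                                                             # (batch, 20, 4)
    # placeholder for y = function(theta): noisy sum of the local parameters
    sim_data = np.random.normal(
        theta.sum(axis=-1, keepdims=True), 1.0,
        size=(batch_size, n_groups, obs_dim)
    )                                                             # (batch, 20, 12)
    return sim_data, hyper_draws, theta
```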

summary_net = HierarchicalNetwork([ 
    DeepSet(summary_dim=128)
])

local_inference_net = InvertibleNetwork(
    num_params=4,num_coupling_layers=8,coupling_design='affine',
    coupling_settings=SETTINGS_POS,permutation="learnable", name="local_inference"
)

hyper_inference_net = InvertibleNetwork(
    num_params=8,num_coupling_layers=8,coupling_design='affine',
    coupling_settings=SETTINGS_POS,permutation="learnable", name="hyper_inference"
)

local_amortizer = AmortizedPosterior(local_inference_net, name="local_amortizer")
hyper_amortizer = AmortizedPosterior(hyper_inference_net, name="hyper_amortizer")
twolevel_amortizer = TwoLevelAmortizedPosterior(summary_net = summary_net,
                                                local_amortizer = local_amortizer,
                                                global_amortizer = hyper_amortizer)

I applied DeepSet to compress the data from (batch_size, 20, 12) into (batch_size, 128). Then I use the configurator and trainer as follows:

def configure_input_train(batch_size):
    out_dict = {}
    sim_data,hyper_draws, prior_draws = shearbuilding(128)
    # Add to keys
    out_dict["summary_conditions"] = sim_data 
    out_dict["hyper_parameters"] = hyper_draws
    out_dict["local_parameters"] = prior_draws 
    return out_dict

# configurator is used to connect data with amortizer
checkpoint_path="model_checkpoints/Four_modal_data20group"
trainer = Trainer(amortizer=twolevel_amortizer,generative_model=shearbuilding,configurator=configure_input_train,
                  checkpoint_path=checkpoint_path,max_to_keep=1)

sim_data, hyper_draws, and prior_draws have shapes (128, 20, 12), (128, 8), and (128, 20, 4), respectively.
However, I get the following error:

ConfigurationError: Could not carry out computations of generative_model ->configurator -> amortizer -> loss! Error trace:
 not enough values to unpack (expected 2, got 1)

I believe something in the configurator is wrong. Any comments would be helpful.
Thanks!

Hi Jice,
could you please send a full reproducible example that produces the error? This would make it easier to trace what is going on and where the error might lie…

Hi Valentin,

Thanks for checking in. I’ve been trying to resolve the issue over the past few days and found the solution after reviewing some discussions on the forum. It turns out that the hierarchical model requires the input data in a 4D shape; once I made that adjustment, it worked.
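For anyone else hitting the same "not enough values to unpack" error, here is a minimal sketch of the fix. The hierarchical setup expects observations as a 4D array (batch, n_groups, n_obs, data_dim); here I assume each group's 12 values are treated as 12 scalar observations, though your split may differ:

```python
import numpy as np

# Stand-in for the 3D observations from my setup
sim_data = np.random.rand(128, 20, 12)      # (batch, groups, 12)

# Add a trailing data dimension to obtain the 4D layout
# (batch, n_groups, n_obs, data_dim) that the hierarchical model expects.
sim_data_4d = sim_data[..., np.newaxis]     # (128, 20, 12, 1)
```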

Additionally, I decided to use a custom summary network instead of DeepSets or transformers. In my case, those architectures were difficult to train, time-consuming, and did not yield good performance. Instead, I manually computed summary statistics that are more sensitive to the parameters, such as the mean, standard deviation, and interquartile range. From the results, I observed that estimating \sigma was more challenging compared to \mu and \theta.
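In case it helps others, here is a rough sketch of the kind of hand-crafted summaries I mean (mean, standard deviation, and interquartile range per group; the exact statistics I used are specific to my model):

```python
import numpy as np

def manual_summaries(x):
    """Hand-crafted summary statistics computed along the observation
    axis of a (batch, n_groups, n_obs, data_dim) array: mean, standard
    deviation, and interquartile range, concatenated per group."""
    mean = x.mean(axis=-2)
    std = x.std(axis=-2)
    q75, q25 = np.percentile(x, [75, 25], axis=-2)
    iqr = q75 - q25
    return np.concatenate([mean, std, iqr], axis=-1)

x = np.random.rand(128, 20, 12, 1)   # (batch, groups, obs, dim)
summaries = manual_summaries(x)      # (128, 20, 3)
```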

By the way, do you happen to have any papers related to the use of hierarchical models in BayesFlow?

Best regards,
Jice

Nice that you got it to work. Regarding papers, you can take a look at Amortized Bayesian Multilevel Models by @Daniel and colleagues.