Enabling multi-GPU (2× GPUs) training in BayesFlow

Hi BayesFlow team,

I’m running BayesFlow v1’s train_online on a single GPU but would like to leverage two GPUs (for both simulation-heavy and neural network phases).

Does the Trainer offer built-in data-parallel or model-parallel multi-GPU support?

If not, what’s the recommended way to wrap the simulation loop and amortizer training to run across multiple GPUs?

Any example code for multi-GPU BayesFlow training would be greatly appreciated.

Hi Noura, I am not sure why the post was stuck and needed extra steps, but I fixed it now. Sorry for the delay. Unfortunately, the Trainer in v1 does not have out-of-the-box support for multi-GPU training. Multi-GPU training is something we are currently working on for v2, so I am linking @LarsKue.


v2 should already support both model and data parallelism to some degree, using JAX. I will add an example once I find the time.
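For readers who want a sense of what JAX data parallelism looks like in general, here is a minimal, self-contained `pmap` sketch. This is not BayesFlow's v2 API; the toy model, loss, and update rule are placeholders chosen only to show the pattern of replicating parameters, sharding the batch across devices, and averaging gradients with `pmean`:

```python
import jax
import jax.numpy as jnp

n_dev = jax.local_device_count()  # e.g. 2 on a two-GPU machine

def loss_fn(params, batch):
    # toy linear model, stand-in for a real network
    preds = batch @ params["w"] + params["b"]
    return jnp.mean((preds - 1.0) ** 2)

def train_step(params, batch):
    grads = jax.grad(loss_fn)(params, batch)
    # average gradients across devices: this is the data-parallel step
    grads = jax.lax.pmean(grads, axis_name="devices")
    return jax.tree_util.tree_map(lambda p, g: p - 0.01 * g, params, grads)

# pmap replicates the step across all local devices
p_train_step = jax.pmap(train_step, axis_name="devices")

# replicate the parameters, one copy per device
params = {"w": jnp.ones((4, 1)), "b": jnp.zeros((1,))}
rep_params = jax.tree_util.tree_map(lambda x: jnp.stack([x] * n_dev), params)

# shard the batch: leading axis indexes devices
batch = jnp.ones((n_dev, 8, 4))
rep_params = p_train_step(rep_params, batch)
```

Since each replica sees a different slice of the batch, the effective batch size scales with the number of GPUs; `pmean` keeps the replicated parameters synchronized after every step.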
