Hi everyone,
I have been using BayesFlow 1.0 and 2.0 in my work, and they have performed very well for my studies. Recently, I started revisiting how generative models actually incorporate conditioning on data/observations.
In conditional INNs, we typically condition by concatenating or injecting the observed data into the subnetworks of the affine coupling layers that map between parameters and latent variables. However, I noticed that there are two other conditioning strategies: augmented latent spaces and latent-mixture INNs (shown in the screenshot, left and middle panels).
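To make the first mechanism concrete, here is a minimal NumPy sketch of concatenation-based conditioning in a single affine coupling layer. This is a toy illustration, not BayesFlow's actual implementation: the "subnet" is just a fixed random linear map standing in for a trained network, and all names and dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

D, C = 4, 3  # parameter dimension (split in half) and condition dimension

# Hypothetical subnet: a fixed random linear map standing in for a trained
# network. It receives [x_half, condition] concatenated and outputs a
# log-scale and a translation for the other half of the input.
W = rng.normal(size=(D // 2 + C, D))

def subnet(x_half, cond):
    h = np.concatenate([x_half, cond]) @ W
    log_s, t = h[: D // 2], h[D // 2 :]
    return np.tanh(log_s), t  # bounded log-scale for numerical stability

def coupling_forward(x, cond):
    x1, x2 = x[: D // 2], x[D // 2 :]
    log_s, t = subnet(x1, cond)       # condition enters via concatenation
    z2 = x2 * np.exp(log_s) + t      # affine transform of the second half
    return np.concatenate([x1, z2])  # first half passes through unchanged

def coupling_inverse(z, cond):
    z1, z2 = z[: D // 2], z[D // 2 :]
    log_s, t = subnet(z1, cond)      # same subnet, same condition
    x2 = (z2 - t) * np.exp(-log_s)   # exact analytic inverse
    return np.concatenate([z1, x2])

x = rng.normal(size=D)
cond = rng.normal(size=C)
z = coupling_forward(x, cond)
assert np.allclose(coupling_inverse(z, cond), x)  # invertible for any cond
```

The key point is that the condition only ever enters through the subnet inputs, so invertibility in the parameters is preserved for every fixed observation. My question is how the other two schemes (augmented latent spaces and latent mixtures) change this picture.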
I have been trying to understand the differences among these three conditioning mechanisms, and when and why one would prefer each. Unfortunately, I have not been able to find references or papers that specifically discuss augmented latent-space conditioning or latent-mixture INNs.
Does anyone know of relevant references, papers, or implementations that explore these two conditioning schemes?
Thanks in advance!
