Seed dependence of log_prob when using spline couplings

Hi all,

I have been having trouble getting repeatable log_prob outputs for the same input dataset when using models with spline couplings. The issue goes away if I set TensorFlow's global seed just before computing the log_prob, but I can't see any reason why this calculation should be seed-dependent.
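For concreteness, here is a minimal, self-contained sketch of the check I am running. The `log_prob` below is a hypothetical stand-in with an explicit random op, not the real BayesFlow model, but it reproduces the same pattern I see:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the real model: a "log_prob" whose forward
# pass contains a stochastic op, which is what I appear to be hitting.
def log_prob(x):
    noise = tf.random.normal(tf.shape(x), stddev=1e-3)  # stand-in randomness
    return tf.reduce_sum(-0.5 * (x + noise) ** 2, axis=-1)

data = tf.constant(np.random.default_rng(0).normal(size=(4, 2)), tf.float32)

# Without a seed, two calls on identical data disagree.
print(np.max(np.abs(log_prob(data) - log_prob(data))))  # > 0

# Resetting the global seed before each call makes the outputs match.
tf.random.set_seed(42)
a = log_prob(data)
tf.random.set_seed(42)
b = log_prob(data)
print(np.max(np.abs(a - b)))  # 0.0
```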

The differences seem to arise when the coupling layer calls the DenseCouplingNet. The only explanation I can think of is that I am loading the model with self.inference_net.load_weights (no summary network) from a saved weights file, and perhaps this isn't sufficient and introduces some variability into the model?
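To try to rule out the weight loading itself, I can check that two independent loads produce identical weights (hypothetical toy model here in place of my inference_net):

```python
import numpy as np
import tensorflow as tf

# Hypothetical toy model standing in for my inference_net.
def build():
    return tf.keras.Sequential([
        tf.keras.Input(shape=(2,)),
        tf.keras.layers.Dense(4),
    ])

m1 = build()
m1.save_weights("check.weights.h5")

m2 = build()
m2.load_weights("check.weights.h5")
m3 = build()
m3.load_weights("check.weights.h5")

# True means the two loads gave identical weights,
# i.e. load_weights itself is deterministic.
print(all(np.array_equal(a, b)
          for a, b in zip(m2.get_weights(), m3.get_weights())))
```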

Thanks in advance for your help.

Cheers
George

Does this also happen if you use tf.config.experimental.enable_op_determinism() before calling log_prob, without setting a seed?
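For reference, something along these lines, called right after importing TensorFlow:

```python
import tensorflow as tf

# Forces normally nondeterministic kernels (e.g. some GPU reductions)
# to run deterministically; available from TF 2.8 on. As I understand it,
# unseeded random ops should then raise an error instead of silently
# varying, which can help localize where the randomness enters.
tf.config.experimental.enable_op_determinism()
```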

Hi,

Yes, I have tried this. To show an example of this in practice, I have quickly put together the following Google Colab notebook:

I doubt it will run for you, as it needs to mount my Google Drive where the model file lives, but it should give a clear picture of what I am doing. If you want to reproduce it, I can email you the .h5 file.

Cheers,
George

Looks like dropout is causing the randomness. As far as I know, using mc_dropout=True enables dropout at evaluation time, so this would be the intended behaviour. Can you try a model that uses mc_dropout=False?
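A minimal plain-Keras illustration of the effect (not BayesFlow itself): a Dropout layer samples a fresh mask on every call made with training=True, which is exactly what MC dropout does at evaluation time:

```python
import numpy as np
import tensorflow as tf

layer = tf.keras.layers.Dropout(rate=0.5)
x = tf.ones((1, 32))

# MC dropout: dropout stays active, so repeated calls disagree.
mc_a = layer(x, training=True)
mc_b = layer(x, training=True)
print(np.array_equal(mc_a, mc_b))    # almost surely False

# Standard evaluation: dropout is a no-op, so outputs are repeatable.
ev_a = layer(x, training=False)
ev_b = layer(x, training=False)
print(np.array_equal(ev_a, ev_b))    # True
```

With mc_dropout=False, the coupling nets should behave like the training=False case above.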


Thanks, this fixes it.
