After installing tensorflow-metal, I saw a huge increase in training time on macOS. As you might already know, I get the following warning:
“2.11+ optimizer tf.keras.optimizers.Adam runs slowly on M1/M2 Macs, please use the legacy Keras optimizer instead, located at tf.keras.optimizers.legacy.Adam”.
But can we use the legacy optimizer? I tried to implement it using the following:
optimizer= tf.keras.optimizers.legacy.Adam
history = trainer.train_online(epochs=10, iterations_per_epoch=1000, optimizer=optimizer, batch_size=32, validation_sims=200)
but I got an error. I have attached the model, and I appreciate your feedback in advance. Thanks a lot.
Hi,
as far as I can tell, the problem is that in the first line you reference the tf.keras.optimizers.legacy.Adam class, but you don't instantiate it. If you call the constructor by appending (), it should work. Full code:
optimizer = tf.keras.optimizers.legacy.Adam()
history = trainer.train_online(epochs=10, iterations_per_epoch=1000, optimizer=optimizer, batch_size=32, validation_sims=200)
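For anyone unsure why the missing parentheses matter: here is a minimal sketch of the class-versus-instance mistake, using a hypothetical stand-in class rather than the real TensorFlow optimizer, so it runs without TensorFlow installed.

```python
class Adam:
    """Hypothetical stand-in for tf.keras.optimizers.legacy.Adam."""
    def __init__(self, learning_rate=0.001):
        # Attributes like this only exist on instances, not on the class itself.
        self.learning_rate = learning_rate

def train(optimizer):
    # A trainer typically reads hyperparameters off the optimizer instance.
    return optimizer.learning_rate

# Passing the class itself (no parentheses) fails, analogous to the error above:
try:
    train(Adam)
except AttributeError as e:
    print("error:", e)

# Calling the constructor with () creates a usable instance:
print(train(Adam()))  # 0.001
```

The same distinction applies to the real optimizer: tf.keras.optimizers.legacy.Adam is a class object until you call it.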
Thanks. That was indeed the issue. But apparently tf.keras.optimizers.legacy.Adam does not work well with the metal plugin, as has been reported elsewhere. Training actually diverges when using the GPU. These are the results from the same code using the legacy optimizer:
So it is a known issue, and not worth pursuing at this time, in case other users are interested.
By the way, I created a new environment to clean-install the packages and found that some of the main packages have been updated, creating dependency conflicts with BayesFlow. I used pip install bayesflow, but when I ran the same code I got the following warning, which later turned out to be an error:
It seems that installing BayesFlow pulls in TensorFlow 2.15 but tensorflow-probability 0.24.0, which requires TensorFlow 2.16. Updating TensorFlow, in turn, would require the updated tf_keras package.
This is resolved by simply switching back to tensorflow-probability 0.23.0. I just wanted to document my experience, as it might be helpful for future releases.
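In case it helps others hitting the same conflict, one way to set up a fresh environment with the version combination described above (the pin is a sketch based on the versions in this thread; adjust for newer BayesFlow releases):

```shell
# BayesFlow pulls in TensorFlow 2.15, so pin tensorflow-probability to 0.23.0,
# since 0.24.0 expects TensorFlow 2.16:
pip install bayesflow "tensorflow-probability==0.23.0"
```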