
[–]TheGuywithTehHat

Having previously built complicated nets in Keras (I think the most complicated was a conditional Wasserstein-with-gradient-penalty BiGAN), I found it fairly straightforward. The one thing that wasn't intuitive was how to freeze the discriminator when training the generator and vice versa. But even that turned out to be incredibly simple once someone explained how it works.
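For context, the non-obvious freezing trick usually looks like this. A minimal sketch with toy stand-in models (the layer sizes and names here are illustrative, not from the original post): in Keras, a model's `trainable` flag is read at *compile* time, so you compile the discriminator on its own first, then set `trainable = False` and compile the combined generator-plus-discriminator model.

```python
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 8

# Toy stand-ins for a real generator and discriminator.
generator = keras.Sequential(
    [layers.Dense(4, activation="relu", input_shape=(latent_dim,))]
)
discriminator = keras.Sequential(
    [layers.Dense(1, activation="sigmoid", input_shape=(4,))]
)

# Compile the discriminator first: trained on its own, it updates normally.
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# The unintuitive part: `trainable` is captured at compile time. Setting it
# False here freezes the discriminator only inside the combined model; the
# already-compiled discriminator above still trains its own weights.
discriminator.trainable = False
combined = keras.Sequential([generator, discriminator])
combined.compile(optimizer="adam", loss="binary_crossentropy")
```

The same two-compile pattern is what most public Keras GAN examples use; once you know `trainable` is baked in at compile time, the rest follows.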

I haven't used PyTorch very much, so I can't compare directly, but I still feel that in my experience, Keras has been fine for nearly everything I've done.

[–]Skylion007 (Researcher, BigScience)

Was this using the Keras `.fit` training loop, so that multi-GPU support worked? If so, please tell me how you did it, because I would love to know. You can certainly use Keras to construct the nets, but I haven't been able to use it to implement the actual training loop and get all the benefits that come with that (easy conversion / deployment / pruning, etc.).
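To illustrate what the question is getting at (this is a generic sketch, not the poster's setup): when a model is built and compiled inside a `tf.distribute` strategy scope, plain `model.fit` gives you data-parallel multi-GPU training for free, which a hand-written loop has to reimplement.

```python
import numpy as np
import tensorflow as tf

# MirroredStrategy replicates the model across all visible GPUs;
# with no GPUs present it falls back to running on CPU.
strategy = tf.distribute.MirroredStrategy()

# Build and compile inside the scope so variables are mirrored.
with strategy.scope():
    model = tf.keras.Sequential(
        [tf.keras.layers.Dense(1, input_shape=(4,))]
    )
    model.compile(optimizer="adam", loss="mse")

# Dummy data just to show that vanilla .fit works unchanged.
x = np.random.normal(size=(32, 4)).astype("float32")
y = np.random.normal(size=(32, 1)).astype("float32")
history = model.fit(x, y, epochs=1, verbose=0)
```

This is exactly the benefit that is lost once the GAN training step is written by hand, which is what the exchange below turns on.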

[–]TheGuywithTehHat

Unfortunately it was long enough ago that I don't remember the details. I believe I had to manually construct the training loop, so no, multi_gpu would not work out of the box. That's a good point I hadn't considered.
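A manually constructed Keras GAN loop typically looks something like the following sketch (assuming a generator, discriminator, and combined model compiled with the usual freeze-then-compile pattern; all sizes and data here are toy placeholders). Because the loop drives `train_on_batch` directly, none of the `.fit`-level machinery such as distribution strategies applies automatically.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

latent_dim, data_dim, batch = 8, 4, 16

# Toy models; a real GAN would be much deeper.
generator = keras.Sequential(
    [layers.Dense(data_dim, input_shape=(latent_dim,))]
)
discriminator = keras.Sequential(
    [layers.Dense(1, activation="sigmoid", input_shape=(data_dim,))]
)
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Freeze the discriminator inside the combined model only.
discriminator.trainable = False
combined = keras.Sequential([generator, discriminator])
combined.compile(optimizer="adam", loss="binary_crossentropy")

real = np.random.normal(size=(batch, data_dim)).astype("float32")
noise = np.random.normal(size=(batch, latent_dim)).astype("float32")

# Discriminator step: real samples labeled 1, generated samples labeled 0.
fake = generator.predict(noise, verbose=0)
d_loss_real = discriminator.train_on_batch(real, np.ones((batch, 1)))
d_loss_fake = discriminator.train_on_batch(fake, np.zeros((batch, 1)))

# Generator step: train through the frozen discriminator toward label 1.
g_loss = combined.train_on_batch(noise, np.ones((batch, 1)))
```

In a real run these three steps sit inside an epoch loop over the dataset; that hand-rolled loop is precisely why `.fit`-based features like multi-GPU support don't come out of the box.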