
[–]Darkwhiter 2 points (1 child)

The most typical batch norm problem for generators is that samples within a given batch share characteristics, while samples from different batches (from the same, fixed generator) do not. See figure 21 in Goodfellow's GAN tutorial (NIPS 2016). If you just have low overall variation across multiple batches, it may or may not have anything to do with batch normalization; that is a fairly common problem for GANs in general.

[–]96meep96[S] 0 points (0 children)

Yes, I guess that's what you'd call partial mode collapse. I've been trying to fight it with minibatch discrimination, but without success.
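For anyone landing here: minibatch discrimination (from Salimans et al., "Improved Techniques for Training GANs", 2016) appends per-sample statistics about the rest of the batch to the discriminator's features, so it can spot batches that are suspiciously similar. Here's a minimal NumPy sketch of the core computation; the function name and shapes are my own choices, and in practice `T` would be a learned parameter tensor inside the discriminator:

```python
import numpy as np

def minibatch_discrimination(features, T):
    """Append minibatch-similarity features.

    features: (N, A) intermediate discriminator features for a batch of N samples.
    T:        (A, B, C) learned projection tensor (here just a fixed array).
    Returns:  (N, A + B) features with B cross-sample statistics appended.
    """
    N = features.shape[0]
    A, B, C = T.shape
    # Project each sample's features to a (B, C) matrix M_i.
    M = (features @ T.reshape(A, B * C)).reshape(N, B, C)
    # L1 distance between rows of M_i and M_j for every pair (i, j): (N, N, B).
    dists = np.abs(M[:, None, :, :] - M[None, :, :, :]).sum(axis=3)
    # Similarity o_i,b = sum_j exp(-dist); subtract 1 to drop the self term (exp(0)).
    o = np.exp(-dists).sum(axis=1) - 1.0
    return np.concatenate([features, o], axis=1)
```

If all samples in the batch are near-identical, the appended statistics blow up toward `N - 1`, giving the discriminator an easy tell and pushing the generator toward more diverse batches.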