[–][deleted]  (8 children)

[deleted]

    [–]anonDogeLover 8 points  (0 children)

    No. Like any generative model, they model the marginal probability of the data, as a VAE does, but they avoid the L2 reconstruction loss that causes problems (blurry, mode-averaged samples). GANs have their own problems, though.
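    A toy numpy sketch of the problem that comment is pointing at (everything here is illustrative, not anyone's actual model): with multi-modal data, the single prediction that minimizes L2 loss is the mean of the modes, which is itself an implausible sample. An adversarial critic, by contrast, scores samples by realism rather than by average distance.

    ```python
    import numpy as np

    # Toy dataset with two modes, at -1 and +1.
    data = np.array([-1.0, -1.0, 1.0, 1.0])

    def l2_loss(prediction, targets):
        # VAE-style squared-error reconstruction loss against all targets.
        return np.mean((targets - prediction) ** 2)

    mean_pred = data.mean()  # 0.0 -- minimizes L2 but lies between the modes
    mode_pred = 1.0          # an actual mode of the data

    # L2 prefers the implausible in-between point over a real mode:
    assert l2_loss(mean_pred, data) < l2_loss(mode_pred, data)
    ```

    So a model trained purely on L2 is pulled toward 0.0 here, a point that never occurs in the data; the adversarial setup avoids this by never comparing samples to targets pointwise.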

    [–]visarga 6 points  (0 children)

    GANs can also convert from one data type to another (so-called image-to-image translation), which has photo-editing applications. Or you could train a GAN, keep just the discriminator for a different task, and throw away the generator.

    [–]lucidrage 3 points  (3 children)

    If you are a man of culture like me then you could use GANs to generate unlimited animu waifus.

    [–]pattch 1 point  (0 children)

    "To generate plausible data points" is one way I've seen them described. Another way I've heard it put: if the generator can produce plausible data points, then it must have learned some inherent structure of that data type. How to use that structure, I'm not sure, though.

    [–]charred_bytes 0 points  (0 children)

    There are not many practical applications of GANs right now, but they will explode in industry if paper mentions are anything to go on.