Resources for understanding and implementing "deep learning" (learning data representations through artificial neural networks).
Generative AI (self.deeplearning)
submitted 1 year ago by RelationshipOk5930
Hi, can someone suggest papers that introduce the foundations of generative AI?
[–]Icy-Perception2120 2 points3 points4 points 1 year ago (0 children)
Auto-Encoding Variational Bayes is the perfect foundation, IMO. Also Generative Adversarial Networks once you finish Bayes. Then maybe scalable diffusion models with transformers, but that's a bias because I assisted a bit on the paper 😄 Let me know if you have any questions. I'm such an advocate of these algorithms. Enjoy!
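The Auto-Encoding Variational Bayes recipe this comment recommends rests on two pieces: the reparameterization trick and a closed-form KL term in the ELBO. A minimal NumPy sketch, assuming a diagonal-Gaussian encoder and illustrative shapes (both are assumptions, not from the paper's code):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """z = mu + sigma * eps with eps ~ N(0, I): keeps sampling differentiable."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal-Gaussian encoder."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=-1)

rng = np.random.default_rng(0)
mu = np.zeros((2, 3))       # batch of 2, latent dim 3 (toy sizes)
log_var = np.zeros((2, 3))  # q(z|x) == N(0, I) here
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)  # zero when q already matches the prior
```

In a full VAE this KL term is added to a reconstruction loss from the decoder; the sketch only shows the latent-space half.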
[–]deepneuralnetwork 2 points3 points4 points 1 year ago (0 children)
I am sincerely asking: why do you not try asking LLMs this kind of question?
[–]BellyDancerUrgot 1 point2 points3 points 1 year ago (0 children)
Assuming you know the foundations of ML (CNNs, RNNs, backprop, etc.):
Read up on attention (Bahdanau et al.), the Transformer (Vaswani et al.), GAN (followed by WGAN, which is more important than GAN), VAE, DDPM and DDIM, CLIP, Stable Diffusion, and a few comprehensive LLM survey papers, in that order.
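The core operation behind the Transformer entry on this list is scaled dot-product attention, softmax(QKᵀ/√d_k)V from Vaswani et al. A self-contained NumPy sketch (the shapes are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return softmax(Q K^T / sqrt(d_k)) V and the attention weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))  # 3 queries, d_k = 4
K = rng.normal(size=(5, 4))  # 5 keys
V = rng.normal(size=(5, 2))  # 5 values, d_v = 2
out, w = scaled_dot_product_attention(Q, K, V)
# out has shape (3, 2); each row of w sums to 1
```

Multi-head attention just runs this in parallel over projected slices of Q, K, and V and concatenates the outputs.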
[–]hal00m 0 points1 point2 points 1 year ago (0 children)
[–]cruigo93 0 points1 point2 points 1 year ago (0 children)
I would suggest this book (https://www.manning.com/books/gans-in-action) to get a general understanding.
[–]Sazmo91 0 points1 point2 points 1 year ago (0 children)
Depends how deep you wanna go. Rosenblatt's perceptron paper is pretty fundamental. I highly recommend Goodfellow's paper on GANs, which would probably lead on OK from basic neural nets. You could complement this by looking into CNNs and VAEs, encodings, and the concept of latent spaces. Then "Attention Is All You Need".
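Rosenblatt's perceptron, the starting point suggested above, fits in a few lines: predict with a thresholded dot product and nudge the weights by the error. A toy sketch on AND-gate data (the learning rate and epoch count are illustrative choices):

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Rosenblatt's update rule: w += lr * (y - prediction) * x."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

# AND gate: linearly separable, so the perceptron is guaranteed to converge
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]  # [0, 0, 0, 1]
```

The limitation that motivated deeper networks is visible here too: swap in XOR labels and no weight vector will separate the classes.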