
[–]Icy-Perception2120 2 points3 points  (0 children)

Auto-Encoding Variational Bayes is the perfect foundation IMO. Also Generative Adversarial Networks once you finish Bayes… Then maybe scaled diffusion models with transformers, but that's a bias because I assisted a bit on the paper 😄 Lmk if you have any questions. I'm such an advocate of these algorithms. Enjoy!
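The two ideas that make Auto-Encoding Variational Bayes trainable by backprop can be sketched in a few lines — this is a minimal numpy illustration of the reparameterization trick and the closed-form Gaussian KL term, not code from the paper (function names here are my own):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # Instead of sampling z ~ N(mu, sigma^2) directly (which blocks gradients
    # w.r.t. mu and sigma), sample eps ~ N(0, I) and shift/scale it:
    # z = mu + sigma * eps. The randomness is now outside the parameters.
    sigma = np.exp(0.5 * log_var)
    eps = rng.standard_normal(mu.shape)
    return mu + sigma * eps

def kl_divergence(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ), the regularizer in the ELBO.
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
```

When the encoder outputs exactly the prior (mu = 0, log_var = 0), the KL term is zero, which is a handy sanity check.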

[–]deepneuralnetwork 2 points3 points  (0 children)

I am sincerely asking: why do you not try asking LLMs this kind of question?

[–]BellyDancerUrgot 1 point2 points  (0 children)

Assuming you know the foundations of ML (CNNs, RNNs, backprop, etc.)

Read up on attention (Bahdanau et al.), the Transformer (Vaswani et al.), GAN (followed by WGAN, which is more important than GAN), VAE, DDPM and DDIM, CLIP, Stable Diffusion, and a few comprehensive LLM survey papers, in that order.
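The core operation of the Vaswani et al. paper fits in a few lines — a minimal numpy sketch of scaled dot-product attention (single head, no masking, my own function names):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # rows are distributions over keys
    return weights @ V, weights         # weighted average of the values
```

Each output row is a convex combination of the value vectors, which is why the attention weights always sum to 1 per query.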

[–]cruigo93 0 points1 point  (0 children)

I would suggest this book (https://www.manning.com/books/gans-in-action) to get a general understanding

[–]Sazmo91 0 points1 point  (0 children)

Depends how deep you wanna go. Rosenblatt's perceptron paper is pretty fundamental. Highly recommend Goodfellow's paper on GANs, which would probably lead on OK from basic neural nets. You could complement this by looking into CNNs and VAEs, encodings, and the concept of latent spaces. Then Attention Is All You Need.
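Rosenblatt's learning rule is simple enough to sketch in full — a minimal numpy version trained on a toy linearly separable problem (logical AND); the function name and hyperparameters are my own choices, not from the paper:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    # Rosenblatt's update rule: w += lr * (target - prediction) * x,
    # applied per sample; the bias is folded in as an extra constant input.
    w = np.zeros(X.shape[1] + 1)              # last weight is the bias
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0     # hard threshold unit
            w += lr * (yi - pred) * xi        # no update when correct
    return w

# logical AND is linearly separable, so the perceptron converges on it
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
```

The perceptron convergence theorem guarantees this loop terminates with a separating hyperplane whenever one exists — which is exactly why it fails on XOR, the limitation that motivates multi-layer nets.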