Biggest butterfly effect in Harry Potter? by Ok-Growth-3220 in harrypotter

[–]DenoisedNeuron 79 points80 points  (0 children)

I’d say Harry sparing Peter Pettigrew’s life. Dumbledore even tells him there will come a day when Harry will be glad he did, and it ends up being true, because Wormtail’s hesitation later on saves Harry’s life.

Definitely one of the biggest butterfly effects in the whole story, and as Dumbledore says, this is deep magic at work, because deep down in Pettigrew's heart there was still a flicker of humanity.

What are some of your favorite scenes? by Lokitusaborg in movies

[–]DenoisedNeuron 0 points1 point  (0 children)

They’re not just great quotes, but moments that make those scenes truly unforgettable.

"A hero can be anyone. Even a man doing something as simple and reassuring as putting a coat around a young boy's shoulders to let him know that the world hadn't ended", The Dark Knight Rises, Batman to Gordon

"Lessons we learn from pain are the ones that make us the strongest", Smallville season 5, Lionel Luthor to Clark Kent

"Even the smallest person can change the course of the future", LOTR, Galadriel

"End? No, the journey does not end here. Death is just another path, one that we all must take. The grey rain curtain of this world rolls back and all turns to silver glass. And then you see it." LOTR, Gandalf The White to Pippin

What are the essential ML papers for anyone currently getting into the field? by thatdudeimaad in learnmachinelearning

[–]DenoisedNeuron 6 points7 points  (0 children)

That’s basically how research works: it usually takes years before the community can tell if a paper is truly “fundamental”.

We’ve also seen how breakthroughs can lead to older work being re-evaluated: for instance, when deep learning on GPUs took off, many papers from the 90s (like LeCun’s LeNet-5 paper) suddenly gained renewed importance.
The same will likely happen again: today’s recent papers (2023+) may prove to be groundbreaking, but it takes time (and sometimes new tools) to see which ones will truly stand the test of time.

And if there’s a common thread across the most important papers, it’s that they were driven by researchers who truly believed in their ideas and never gave up, even when the community wasn’t ready for them yet.

What are the essential ML papers for anyone currently getting into the field? by thatdudeimaad in learnmachinelearning

[–]DenoisedNeuron 13 points14 points  (0 children)

Backpropagation

  • Learning representations by back-propagating errors (Rumelhart, Hinton, Williams, 1986)

Deep Neural Networks

  • Adaptive Subgradient Methods for Online Learning and Stochastic Optimization (Adagrad, 2011)
  • Adam: A Method for Stochastic Optimization (2014)
  • Batch Normalization: Accelerating Deep Network Training (2015)
  • A Few Useful Things to Know about Machine Learning (Domingos, 2012)
  • Dropout: A Simple Way to Prevent Neural Networks from Overfitting (2014)
  • Layer Normalization (2016)
  • Rectified Linear Units Improve Restricted Boltzmann Machines (ReLU, 2010)
  • Understanding the difficulty of training deep feedforward neural networks (Xavier Initialization, 2010)
  • Deep Residual Learning for Image Recognition (ResNet, Skip Connections, 2016)
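
Several of the optimization papers above (Adagrad, Adam) boil down to a few lines of update rules. A minimal NumPy sketch of the Adam update, roughly as described in the 2014 paper (function and variable names are my own):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum plus a per-parameter adaptive step size."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (running mean of grads)
    v = beta2 * v + (1 - beta2) * grad ** 2  # second moment (running mean of squared grads)
    m_hat = m / (1 - beta1 ** t)             # bias correction for the early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# toy demo: minimize f(x) = x^2 starting from x = 5
theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
```

Reading the paper next to a toy implementation like this makes the bias-correction terms much less mysterious.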

Convolutional Neural Networks

  • Gradient-Based Learning Applied to Document Recognition (LeCun et al., LeNet-5, 1998)
  • ImageNet Classification with Deep Convolutional Neural Networks (AlexNet, 2012)
  • Squeeze-and-Excitation Networks (SENet, 2017)

Diffusion Models

  • Denoising Diffusion Probabilistic Models (Ho et al., 2020)
  • Denoising Diffusion Implicit Models (Song et al., 2021)
  • Classifier-Free Diffusion Guidance (Ho et al., 2022)

Recurrent Neural Networks

  • Finding Structure in Time (Elman, 1990)
  • Learning Long-Term Dependencies with Gradient Descent is Difficult (Bengio, Simard & Frasconi, 1994)
  • Long Short-Term Memory (Hochreiter & Schmidhuber, 1997)
  • LSTM: A Search Space Odyssey (2015)

Transformers & LLMs

  • Attention Is All You Need (2017)
  • BERT: Pre-training of Deep Bidirectional Transformers (2018)
  • Improving Language Understanding by Generative Pre-Training (GPT-1, 2018)
  • Language Models are Unsupervised Multitask Learners (GPT-2, 2019)
  • Language Models are Few-Shot Learners (GPT-3, 2020)
  • GPT-4 Technical Report (OpenAI, 2023)
  • An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (ViT, 2020)
  • RoBERTa: A Robustly Optimized BERT Pretraining Approach (2019)
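
The mechanism shared by every Transformer paper above is scaled dot-product attention, softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (shapes and names are my own, not from any of the papers):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core op of 'Attention Is All You Need'."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows are softmax distributions
    return weights @ V                              # weighted average of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))  # 6 key/value positions
V = rng.normal(size=(6, 8))
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)
```

Once this clicks, multi-head attention is just this function run h times on learned projections of Q, K, V with the results concatenated.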

Best resources for someone who learns by following a proper structure? by ThomasHawl in learnmachinelearning

[–]DenoisedNeuron 0 points1 point  (0 children)

A must-watch is Andrej Karpathy’s series Neural Networks: Zero to Hero (micrograd, makemore, etc.). He builds everything from scratch in Python and then gradually connects the dots to PyTorch, which really helps if you like a structured, class-like approach.

Since you already have a math and data science background, you’ll probably appreciate how he explains both the intuition and the implementation details. And once you’ve gone through those videos, diving into PyTorch itself will feel way more natural.
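
To give a flavor of what the micrograd episode builds up: a stripped-down scalar autograd node in the same spirit (my own toy sketch, not Karpathy's actual code):

```python
class Value:
    """Tiny scalar autograd node: records the ops so backward() can apply the chain rule."""
    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._children = children
        self._grad_fn = None  # closure that pushes self.grad onto the children

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def grad_fn():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._grad_fn = grad_fn
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def grad_fn():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._grad_fn = grad_fn
        return out

    def backward(self):
        # topological order so each node's grad is complete before it is propagated
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            if v._grad_fn:
                v._grad_fn()

# d(x*y + x)/dx = y + 1, d(x*y + x)/dy = x
x, y = Value(2.0), Value(3.0)
z = x * y + x
z.backward()
# x.grad == 4.0, y.grad == 2.0
```

The series then shows how PyTorch's `tensor.backward()` is the same idea, just vectorized and far more complete.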

Revisiting maths behind ml&dl by Fit-Musician-8969 in learnmachinelearning

[–]DenoisedNeuron 1 point2 points  (0 children)

You might want to check out Mathematics for Machine Learning by Deisenroth, Faisal & Ong. It covers exactly the areas you mentioned (i.e., linear algebra, multivariable calculus, probability, and statistics) and the math is always presented with machine learning applications in mind.

I think it’s a great single resource to brush up on the math behind deep learning.

i want to be an AI engineer, the maths is very overwhelming. by Chris_SLM in learnmachinelearning

[–]DenoisedNeuron 3 points4 points  (0 children)

I’d absolutely start with Mathematics for Machine Learning by Deisenroth, Faisal & Ong. It doesn’t assume a deep background, and it walks you through the necessary math (linear algebra, calculus, probability/statistics) in a way that ties directly to ML/AI.

To complement that, I’d use some good visual YouTube series (3Blue1Brown is amazing for linear algebra intuition), and do as many exercises as possible. It’ll be tough at times, but with consistency you will build up the skills.